Failure to run with PyRDF in non-local mode

Hello,
I'm having problems running my jobs with PyRDF in non-local mode.
I use the current default SWAN configuration and connect to k8s with its standard configuration as well.
My notebook runs fine locally, and the tutorial notebooks (e.g. the dimuon selection) also run fine with Spark.
However, when I enable Spark in my notebook, the RDataFrame seems to be empty and I'm unable to do basic operations (Define, Filter, etc.).
I put an example below.
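Something along these lines (the file and tree names here are placeholders, not my actual ones):

```python
import PyRDF

# Distributed execution on the Spark cluster attached to the SWAN session
PyRDF.use("spark")

# EOS file opened through the local /eos mount (placeholder path) -- this is what fails
df = PyRDF.RDataFrame("Events", "/eos/user/p/psilva/somefile.root")

# Basic operations behave as if the RDataFrame were empty
print(df.Count().GetValue())
```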

Any suggestion is appreciated.
Thanks,
Pedro

Oops, sorry for the noise, I should have looked into older threads - my bad.
The answer in the end is simple:

"when running on spark one needs to prepend paths in eos with

root://eosuser.cern.ch/

so that xrootd is used"

Thanks,
Pedro

Hi @psilva,
Indeed, the k8s workers are unable to see the local filesystem of your CERNBox/SWAN session. Nonetheless, I think this could be mentioned somewhere in the PyRDF docs.
Don't hesitate to ask any other questions you may have!
Cheers,
Vincenzo