Passing the SparkContext variable to executed scripts

Dear Experts,

I have been using RDataFrame with Spark clusters and have no issues when everything lives in a single Jupyter notebook. Now I want to run my local scripts that use RDataFrame without converting them all to Jupyter notebooks, i.e. keep most of the scripts as Python files and execute them from the notebook.
For example, there is a script called “” that uses RDataFrame with Spark clusters. I can execute this Python file from a notebook as follows:

import subprocess
subprocess.run(["python", ""])

The current problem is that the code below cannot be executed correctly since it cannot find the SparkContext provided by SWAN.

import ROOT

RDataFrame = ROOT.RDF.Experimental.Distributed.Spark.RDataFrame
rdf = RDataFrame(tchain, sparkcontext=sc)

Is there a way to pass the SparkContext to the executed scripts (or imported modules), so that I can keep the structure of my local scripts as much as possible?
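For context, this is roughly the structure I have in mind: importing the script as a module and handing it the notebook's SparkContext as a function argument, instead of spawning a subprocess. This is only a sketch; `my_analysis` and `run_analysis` are placeholder names, and the actual ROOT calls are shown as comments because they need a SWAN/Spark session to run.

```python
# Sketch of a script restructured so the SparkContext is passed in,
# rather than looked up globally. "run_analysis" is a placeholder name.

def run_analysis(sc, files):
    # In the real script this body would be something like:
    #   import ROOT
    #   RDataFrame = ROOT.RDF.Experimental.Distributed.Spark.RDataFrame
    #   rdf = RDataFrame(tchain, sparkcontext=sc)
    #   ... book and trigger the computation ...
    # Here we only show that the context object arrives intact.
    return type(sc).__name__, list(files)

# In the SWAN notebook, where `sc` already exists, usage would be:
#   from my_analysis import run_analysis
#   run_analysis(sc, ["data.root"])

# Self-contained check with a stand-in object instead of a SparkContext:
class DummyContext:
    pass

print(run_analysis(DummyContext(), ["data.root"]))
```

The key point of the pattern is that the module never tries to create or discover the SparkContext itself; the notebook (which SWAN has already given an `sc`) owns it and passes it down.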
