Dear Enric, Dear SWAN/Spark team,
I would like to report a small issue that was not present last week in the PyRDF module I am using. For now, this module needs the Bleeding Edge software stack in order to get its 0.2.0 version, so the problem might be related to some instabilities there. Here is the bug:
import ROOT, sys
sys.path.insert(0, '/eos/home-v/vbrian/SWAN_projects/dvcs_project/Tools/Modules/PyRDF')
import PyRDF
sc.addPyFile("Tools/Modules/PyRDF.zip")  # my own setup
import PyRDF

PyRDF.use('spark')
r = PyRDF.RDataFrame(10)
r = r.Define("weight", "2.")
#r.AsNumpy()
print(r.Count().GetValue())
print(r.Sum("weight").GetValue())
This works with the "local" backend; the snippet above uses three RDataFrame methods that worked last week but no longer do. This could be linked to the usage of an old PyRDF version, but I use the one from GitHub and followed the instructions in https://github.com/JavierCVilla/PyRDF/tree/master/demos (also setting up the Spark connector).
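Since 0.2.0 is the version needed, one sanity check would be to compare the version string reported by the installed package against the required one. This is only a sketch: I am assuming PyRDF exposes a `__version__` attribute, which may not hold, and the `installed` value below is a placeholder.

```python
def version_tuple(v):
    """Turn a dotted version string like '0.2.0' into a comparable tuple of ints."""
    return tuple(int(part) for part in v.split("."))

REQUIRED = "0.2.0"

# On SWAN one would read this from the module, e.g.:
#   import PyRDF; installed = PyRDF.__version__   (assuming the attribute exists)
installed = "0.2.0"  # placeholder

if version_tuple(installed) < version_tuple(REQUIRED):
    print("PyRDF too old:", installed)
else:
    print("PyRDF version OK:", installed)
```

Comparing tuples of ints rather than raw strings avoids the trap where "0.10.0" sorts before "0.2.0" lexicographically.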
I may have forgotten something, but since it worked last Friday, I have to admit I do not understand what is going on on my side. First of all, how can I check that I am really using the Bleeding Edge software stack and that it is not redirecting me to LCG 96?
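One quick check I could imagine (a sketch, not an official SWAN mechanism): ask Python where each module is actually imported from, since the file path should reveal whether a package comes from my EOS checkout or from an LCG view on CVMFS. The `module_origin` helper below is my own; on SWAN one would call it with "PyRDF" or "ROOT".

```python
import importlib

def module_origin(name):
    """Return the file a module is imported from, or None if it cannot be imported."""
    try:
        mod = importlib.import_module(name)
    except ImportError:
        return None
    return getattr(mod, "__file__", None)

# In a SWAN notebook, something like:
#   print(module_origin("PyRDF"))   # EOS path vs. LCG view path
#   print(module_origin("ROOT"))    # Bleeding Edge vs. LCG_96 path
# Demonstrated here with a standard-library module:
print(module_origin("json"))
```

The returned path for ROOT should make it obvious which stack the session resolved to.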