I am just trying to import ROOT files and save them to a Pandas DataFrame, but the kernel keeps dying.
This is the code snippet:
import uproot as ur
import pandas as pd

temp = []
for name in temp_mc:
    file = ur.open(name)
    tree = file['Events']
    temp.append(tree.pandas.df())
df_signal = pd.concat(temp, join="inner", ignore_index=True)
Did you install uproot on your CERNBox? It is not included in the LCG releases.
Do the files that you open with uproot reside in the same directory as the notebook?
Hi, yes I did: I installed uproot and uploaded the ROOT files to CERNBox.
temp_mc = []
temp_mc.append(path_to_root_file_1)
temp_mc.append(path_to_root_file_2)

temp = []
for name in temp_mc:
    file = ur.open(name)
    tree = file['Events']
    temp.append(tree.pandas.df())
df_signal = pd.concat(temp, join="inner", ignore_index=True)
This block of code (reading the signal data) executes without an error. Then, when I try to read the other data (background) the same way, the kernel dies.
My guess is that the server kills my processes because they are using too much memory.
Yes, that could indeed be the reason. How much memory do you ask for when you start your SWAN session? If you go to https://swan006.cern.ch you can ask for up to 16 GB, but that is the absolute limit right now.
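As a quick check, pandas itself can report how much memory a DataFrame occupies; if the signal DataFrame alone is already close to your session limit, loading the background the same way will push you over. A minimal sketch, using the df_signal from your snippet:

# Report the in-memory size of the DataFrame in GB;
# deep=True also counts the payload of object/string columns.
size_gb = df_signal.memory_usage(deep=True).sum() / 1e9
print(f"df_signal occupies about {size_gb:.2f} GB")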
Thank you very much, this actually solves my problem. I only had access to 10 GB of memory.
These ROOT files were only a few GB of data, so I should find a better way to read larger ROOT files and to manage memory and caching in uproot. A sketch of what I plan to try is below.
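The idea is to read each tree in fixed-size slices and shrink every slice (branch selection plus cuts) before keeping it, so a whole file is never resident in memory at once. This uses the uproot 3 style API from the snippets above; the file names, branch list, and the pt > 20 cut are placeholders, not values from this thread:

import uproot as ur
import pandas as pd

# Placeholder inputs -- substitute your own files and branches.
temp_mc = ["signal_1.root", "signal_2.root"]
branches = ["pt", "eta", "phi"]   # read only the branches you need

chunks = []
step = 100_000                    # entries per slice; tune to your memory budget
for name in temp_mc:
    tree = ur.open(name)["Events"]
    for start in range(0, tree.numentries, step):
        # Materialise only one slice of the tree at a time.
        df = tree.pandas.df(branches, entrystart=start, entrystop=start + step)
        # Shrink the slice before keeping it, e.g. with an event selection.
        chunks.append(df[df["pt"] > 20])
df_signal = pd.concat(chunks, join="inner", ignore_index=True)

As I understand it, pd.concat still keeps every reduced chunk in memory, so the real savings come from the branch selection and the cuts rather than from the slicing alone; newer uproot versions also provide uproot.iterate for this kind of chunked reading.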