Kernel keeps dying on the same line

I am just trying to import ROOT files and save them to a pandas DataFrame, but the kernel keeps dying.
This is the code snippet:

import uproot as ur
import pandas as pd  # needed for pd.concat below

temp = []
for name in temp_mc:  # temp_mc: list of paths to the ROOT files (defined below)
    file = ur.open(name)
    tree = file['Events']
    # reads the whole 'Events' tree into one in-memory DataFrame
    temp.append(tree.pandas.df())
df_signal = pd.concat(temp, join="inner", ignore_index=True)

Dear Toni,

Did you install uproot on your CERNBox? It is not included in the LCG releases.
Do the files that you open with uproot reside in the same directory as the notebook?

Hi, yes I did: I installed uproot and uploaded the ROOT files to CERNBox.

temp_mc = []
temp_mc.append(path_to_root_file_1)
temp_mc.append(path_to_root_file_2)

temp = []
for name in temp_mc:
    file = ur.open(name)
    tree = file['Events']
    temp.append(tree.pandas.df())
df_signal = pd.concat(temp, join="inner", ignore_index=True)

And this block of code (reading the signal data) executes without an error. Then, when I try to read the other data (background) in the same way, the kernel dies.

My guess is that the server kills my processes because they are using too much memory.

Yes, that could indeed be the reason. How much memory do you ask for when you start your SWAN session? If you go to https://swan006.cern.ch you can ask for up to 16 GB, but that is the absolute limit right now.

Thank you very much, this actually solves my problem: I only had access to 10 GB of memory.
These ROOT files contain only a few GB of data, so I should find a better way to read larger ROOT files and to manage memory and caching in uproot.
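For a start, reading only the branches I actually need instead of the whole tree should already help. A minimal sketch, assuming uproot 3's pandas.df (the branch names 'pt' and 'eta' are placeholders):

import uproot as ur
import pandas as pd

temp = []
for name in temp_mc:
    tree = ur.open(name)['Events']
    # Materialize only the columns needed for the analysis,
    # instead of every branch in the tree.
    temp.append(tree.pandas.df(branches=['pt', 'eta']))
df_signal = pd.concat(temp, join="inner", ignore_index=True)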

Dear Toni,
I also use uproot on SWAN, and at the time I found this presentation:
https://indico.cern.ch/event/686641/contributions/2894906/attachments/1606247/2548596/pivarski-uproot.pdf
It may be of interest to you, especially the "iterate" part.
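For example, something along these lines keeps the peak memory bounded by reading the trees in chunks. An untested sketch, assuming uproot 3's uproot.iterate with outputtype=pandas.DataFrame (in uproot 4 the equivalent is uproot.iterate(..., library="pd")); the chunk size and the selection are placeholders:

import uproot as ur
import pandas as pd

chunks = []
# iterate reads the files in pieces of `entrysteps` entries,
# yielding one DataFrame per piece instead of one giant array.
for df in ur.iterate(temp_mc, 'Events', entrysteps=100000, outputtype=pd.DataFrame):
    # Reduce each chunk here (cuts, column selection) before keeping it;
    # otherwise the final concat still needs memory for everything.
    chunks.append(df[df['pt'] > 20])  # placeholder selection
df_signal = pd.concat(chunks, join="inner", ignore_index=True)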

Best regards,
Brian

Dear Toni,
You can also try ROOT's RDataFrame; here is a tutorial on how to read and process a ROOT tree/ntuple and then wrap it into a pandas DataFrame:
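A rough sketch of the idea (the file, tree, and branch names here are placeholders; RDataFrame.AsNumpy requires a reasonably recent ROOT, 6.16 or later):

import ROOT
import pandas as pd

# Build the dataframe lazily over the tree; nothing is read yet.
rdf = ROOT.RDataFrame('Events', 'signal.root')
# Apply the selection inside ROOT, then export only the surviving
# rows and the requested columns to numpy arrays.
arrays = rdf.Filter('nMuon > 0').AsNumpy(columns=['nMuon'])
df = pd.DataFrame(arrays)

This way the event loop and the selection run in compiled code, and only the reduced result ever lands in Python memory.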