Use SWAN with a custom software stack

Hello,

I am working on the ATLAS experiment, where we have recently begun deploying custom builds of analysis software that are independent of the LCG releases. Among other things, these include our own builds of python and ROOT (the latter may even be a custom version with unofficial modifications).

Users can set up the software by sourcing a setup script, after which they have access to ROOT, our build of python, and so on.

My question is: would it be possible to use SWAN on top of these releases? What I am imagining is an option to specify the path to a setup script that SWAN would source before spawning the server, instead of (presumably) setting up an LCG release.

Having the ability to do this would be incredibly useful as it would allow us to start providing notebooks as documentation for how to use the software bundled in the release.

Many thanks
Will

Hello Will,

I have a couple of answers for your request. The first, applicable right now, is that you can configure your own software environment / packages on top of the LCG releases. This is done via an environment script that you can select in the form when starting your session. Here is some documentation on how to use such a script to modify your PYTHONPATH variable:

But you could also have a user env script that sources your setup script on CVMFS. You can easily try this and see if there are any problems when setting up this environment on top of an LCG release (in other words, what would happen in practice is that the LCG setup.sh is sourced first and then yours). Also, where exactly in CVMFS are those builds located? Is it in a repository that is already available (mounted) in SWAN, or would we need to add a new one (the latter would not be a problem)?

Another answer to your request is that we are currently migrating our interface to JupyterLab. In that new interface, one of the options we will offer users is to configure their environment with something that is not an LCG release, e.g. the CMSSW or FCC stacks. Your builds could also be one of the options there. For this I’d like to understand how widespread the use of these builds will be: is it something that targets a particular research group, or will it be generic for ATLAS?

Cheers,

Enric

Thanks Enric. I was aware that you can specify a post-stack setup script, but when I’ve tried to use this in the past it hasn’t allowed me to ‘override’ the ROOT version that the notebooks effectively use (ROOT would still come from the LCG release rather than from the custom software release). If you believe it should be possible to override the ROOT version with this method, please say so and I can give it another go.

This is software that we intend to be used not only across the whole of ATLAS but absolutely we want to have a user base outside of ATLAS too.

At the moment the software is deployed to /cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis. Our users currently set up the software by running the command asetup StatAnalysis,0.0.2. Now, asetup is a command provided by the ATLAS software environment setup, but in theory it should just be sourcing a few setup scripts. Specifically, I believe it is doing:

source /cvmfs/sft.cern.ch/lcg/releases/gcc/11.2.0-8a51a/x86_64-centos7/setup.sh
source /cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/setup.sh

On what timescale will this new method (which can bypass LCG releases) be available?

Thanks
Will

If you believe that it should be possible for the ROOT version to be overridden with this method though please say so and I can give it another go.

I believe so, yes. The ATLAS build needs to appear first in PATH, LD_LIBRARY_PATH and PYTHONPATH. Also, there are a few ROOT-related variables defined by the LCG release that need to be redefined (or even unsetting them might work). For example, for the Bleeding Edge stack:

ROOT_INCLUDE_PATH=/cvmfs/sft-nightlies.cern.ch/lcg/views/devswan/Tue/x86_64-centos7-gcc8-opt/include/Geant4:/cvmfs/sft.cern.ch/lcg/releases/jsonmcpp/3.10.5-f26c3/x86_64-centos7-gcc8-opt/include:/cvmfs/sft-nightlies.cern.ch/lcg/views/devswan/Tue/x86_64-centos7-gcc8-opt/src/cpp:/cvmfs/sft-nightlies.cern.ch/lcg/views/devswan/Tue/x86_64-centos7-gcc8-opt/include:/cvmfs/sft.cern.ch/lcg/releases/Python/3.9.12-9a1bc/x86_64-centos7-gcc8-opt/include/python3.9:/cvmfs/sft-nightlies.cern.ch/lcg/latest/R/4.1.2-f9ee4/x86_64-centos7-gcc8-opt/lib64/R/include:/cvmfs/sft-nightlies.cern.ch/lcg/latest/R/4.1.2-f9ee4/x86_64-centos7-gcc8-opt/lib64/R/library/RInside/include:/cvmfs/sft-nightlies.cern.ch/lcg/latest/R/4.1.2-f9ee4/x86_64-centos7-gcc8-opt/lib64/R/library/Rcpp/include
ROOTSYS=/cvmfs/sft-nightlies.cern.ch/lcg/nightlies/devswan/Tue/ROOT/HEAD/x86_64-centos7-gcc8-opt
CPPYY_BACKEND_LIBRARY=/cvmfs/sft-nightlies.cern.ch/lcg/nightlies/devswan/Tue/ROOT/HEAD/x86_64-centos7-gcc8-opt/lib/libcppyy_backend3_9

At the moment the software is deployed to /cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis

Ok good, that path is already available in SWAN.

Our users currently set up the software by running the command asetup StatAnalysis,0.0.2. Now, asetup is a command provided by the ATLAS software environment setup, but in theory it should just be sourcing a few setup scripts. Specifically, I believe it is doing:

Then you can either run asetup in the SWAN user env script (with its full path on CVMFS) or use the individual source commands; either should be fine as long as the needed env variables are set.
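For concreteness, a user env script along these lines might simply replay the two source commands quoted above (an untested sketch using the CVMFS paths from Will’s message; extra environment cleanup may be needed in practice):

```shell
# Untested sketch of a SWAN user env script: set up the ATLAS StatAnalysis
# build on top of the already-sourced LCG release, using the paths quoted
# above. Additional variable cleanup may be needed in practice.
source /cvmfs/sft.cern.ch/lcg/releases/gcc/11.2.0-8a51a/x86_64-centos7/setup.sh
source /cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/setup.sh
```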

On what timescale will this new method (which can bypass LCG releases) be available?

This is scheduled to happen this year as part of the migration to JupyterLab. In a first phase it will be exposed only to a small subset of users for testing; if you are interested in being one of them, please let us know.

Cheers,

Enric

OK, I tried again to use a post-stack setup script to get the desired behaviour. So far I’ve not had much luck. You are right that we have to make sure the LCG versions of python (and ROOT) don’t get in the way of the versions being set up from our software, which involves unsetting some environment variables. So far I have unset PYTHONHOME, PYTHONPATH, ROOTSYS and CPPYY_BACKEND_LIBRARY before setting up our software.

Unfortunately, when I try to start a ROOT notebook in SWAN with this setup, it never manages to connect to the kernel. I guessed that the kernel is obtained by running “root --notebook” in the background, and running that in the terminal failed because the jupyter modules are missing after I cleared PYTHONPATH. So I pip installed them, which did allow me to run that command, albeit with lots of errors about missing extensions. But the notebook page still cannot connect to its kernel.

So, in summary, this isn’t proving as easy as I would like. FWIW, this is the setup script I am calling:

unset PYTHONHOME
unset PYTHONPATH
unset ROOTSYS
unset CPPYY_BACKEND_LIBRARY
export ATLAS_LOCAL_ROOT_BASE=/cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase
source ${ATLAS_LOCAL_ROOT_BASE}/user/atlasLocalSetup.sh
asetup StatAnalysis,0.0.2

Is there any further assistance you can provide to help make this work?

Hi,

Regarding logistics: instead of using an environment script, at this stage you can just test things from the SWAN terminal. In the terminal, the LCG release is already sourced, so you can add things on top and see if they work. Once you find a setup that works, you can use the env script and test notebooks as well.

What I would make sure of is:

  1. You prepend to LD_LIBRARY_PATH, PATH and PYTHONPATH (but you don’t unset any of these).
  2. The ATLAS stack does not redefine any Jupyter-related variables (I understand the ATLAS stack does not provide Jupyter kernels?).
  3. I would unset: CPPYY_BACKEND_LIBRARY, ROOT_INCLUDE_PATH and ROOTSYS.

After this, I would try to run root from the terminal and see what happens (is it your ROOT that runs?). Then I would try python -c 'import ROOT; print(ROOT.__file__)'. Does that at least work, before we go further?
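As a concrete illustration of points 1 and 3, here is a minimal sketch of prepending rather than unsetting. Note this is hypothetical: INSTALL_DIR is a placeholder standing in for the StatAnalysis InstallArea on CVMFS, and the subdirectory names are illustrative only.

```shell
# Hypothetical sketch: prepend the custom build's directories so they win the
# lookup, without unsetting the LCG paths. INSTALL_DIR is a placeholder; in
# SWAN it would be the StatAnalysis InstallArea on CVMFS.
INSTALL_DIR=/tmp/statanalysis-demo
mkdir -p "$INSTALL_DIR/bin" "$INSTALL_DIR/lib" "$INSTALL_DIR/python"
export PATH="$INSTALL_DIR/bin:$PATH"
export LD_LIBRARY_PATH="$INSTALL_DIR/lib:${LD_LIBRARY_PATH:-}"
export PYTHONPATH="$INSTALL_DIR/python:${PYTHONPATH:-}"
# ROOT-related variables inherited from the LCG view that can shadow the
# custom build (point 3):
unset ROOTSYS ROOT_INCLUDE_PATH CPPYY_BACKEND_LIBRARY
# First element of PATH is now the custom build's bin directory:
echo "${PATH%%:*}"
```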

Unfortunately, following the above leads to segfaults. I note that the ‘python’ command still points at the SWAN/LCG version of python. If I use “python3” I don’t get a segfault, but then I get errors about:

Python runtime state: core initialized
ModuleNotFoundError: No module named 'encodings'

It is only once I unset PYTHONPATH and PYTHONHOME, and use python3 as the command (which points at the python in the release), that things work:

bash-4.2$ python3 -c 'import ROOT; print(ROOT.__file__)'
/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/ROOT/__init__.py
bash-4.2$

Just for completeness: I posted an example of making an entirely custom Jupyter kernel, which can contain any software you like, at Installing custom Jupyter kernels at SWAN startup. As highlighted in that post, it is a bit of a hack, so I am not suggesting it is the way to go, but it might be useful as another data point on how this can be done.

Hello,

This script works for me:

unset PYTHONHOME
unset ROOTSYS
unset CPPYY_BACKEND_LIBRARY
unset ROOT_INCLUDE_PATH
export ATLAS_LOCAL_ROOT_BASE=/cvmfs/atlas.cern.ch/repo/ATLASLocalRootBase
source ${ATLAS_LOCAL_ROOT_BASE}/user/atlasLocalSetup.sh
asetup StatAnalysis,0.0.2

and the python3 of the ATLAS build works with the ATLAS ROOT, as you managed too. OK, this is already a start.

Now, notebooks are trickier because, as Phil explained, you need a working kernel. SWAN provides its own kernels, which come from CVMFS and are found by Jupyter via JUPYTER_PATH. The python one for LCG 101 is here:

/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/share/jupyter/kernels/python3/kernel.json

and it uses the python3.9 of the LCG release. Therefore that one can’t work in your case; you need your own. You could install it in the ATLAS stack, make sure it uses your python, and configure the Jupyter variables to point to it. An alternative is to install it first in your container / CERNBox (at least for testing purposes).
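For reference, a kernel spec is just a small JSON file that tells Jupyter which command starts the kernel. A hypothetical spec pointing at the ATLAS stack’s python might be written like this (the install location and the python path under the StatAnalysis InstallArea are illustrative assumptions, not confirmed paths):

```shell
# Hypothetical sketch: write a kernel spec that launches ipykernel with the
# ATLAS stack's python. KERNEL_DIR and the interpreter path are illustrative.
KERNEL_DIR=/tmp/demo-kernels/statanalysis
mkdir -p "$KERNEL_DIR"
cat > "$KERNEL_DIR/kernel.json" <<'EOF'
{
 "display_name": "Python 3 (StatAnalysis)",
 "language": "python",
 "argv": [
  "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/bin/python3",
  "-m", "ipykernel_launcher", "-f", "{connection_file}"
 ]
}
EOF
# Jupyter would discover the spec if JUPYTER_PATH (or the kernels directory
# it searches) points at /tmp/demo-kernels.
```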

I know this is a bit of a hassle but, if you want to create a stack that works with SWAN (or with Jupyter in general), I think providing the kernels you want is the way to go. Then, once we allow you to configure your environment with another stack, we’ll discover those kernels and make them available to you in the interface. Hope this makes sense!

Thanks. Since we are building ROOT, I tried adding:

export JUPYTER_PATH=$StatAnalysis_DIR/etc/notebook

to the setup script, in the hope that it would provide me with a ROOT kernel to use as a starting point. But when I try to start a ROOT notebook, it still hangs at the connecting-to-kernel step.

Should this have worked?

Thanks
Will

I checked and that can’t work because our Jupyter server only looks for kernels in:

/scratch/username/.local/share/jupyter/kernels

Actually, when I use the env script above, I see that both the python and the ROOT kernels present in the folder above are configured to use the python of the ATLAS stack. That still does not work, because the two python stacks (ATLAS’ and LCG’s) are still mixed:

Traceback (most recent call last):
  File "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/JupyROOT/kernel/rootkernel.py", line 22, in <module>
    from metakernel import MetaKernel
  File "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/ROOT/_facade.py", line 153, in _importhook
    return _orig_ihook(name, *args, **kwds)
  File "/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/metakernel/__init__.py", line 1, in <module>
    from ._metakernel import (
  File "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/ROOT/_facade.py", line 153, in _importhook
    return _orig_ihook(name, *args, **kwds)
  File "/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/metakernel/_metakernel.py", line 22, in <module>
    from ipykernel.kernelapp import IPKernelApp
  File "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/ROOT/_facade.py", line 153, in _importhook
    return _orig_ihook(name, *args, **kwds)
  File "/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/ipykernel/__init__.py", line 2, in <module>
    from .connect import *
  File "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/ROOT/_facade.py", line 153, in _importhook
    return _orig_ihook(name, *args, **kwds)
  File "/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/ipykernel/connect.py", line 12, in <module>
    import jupyter_client
  File "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/ROOT/_facade.py", line 153, in _importhook
    return _orig_ihook(name, *args, **kwds)
  File "/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/jupyter_client/__init__.py", line 4, in <module>
    from .connect import *
  File "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/ROOT/_facade.py", line 153, in _importhook
    return _orig_ihook(name, *args, **kwds)
  File "/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/jupyter_client/connect.py", line 21, in <module>
    import zmq
  File "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/ROOT/_facade.py", line 153, in _importhook
    return _orig_ihook(name, *args, **kwds)
  File "/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/zmq/__init__.py", line 103, in <module>
    from zmq import backend
  File "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/ROOT/_facade.py", line 153, in _importhook
    return _orig_ihook(name, *args, **kwds)
  File "/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/zmq/backend/__init__.py", line 32, in <module>
    raise original_error from None
  File "/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/zmq/backend/__init__.py", line 27, in <module>
    _ns = select_backend(first)
  File "/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/zmq/backend/select.py", line 32, in select_backend
    mod = import_module(name)
  File "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/zmq/backend/cython/__init__.py", line 6, in <module>
    from . import (
ImportError: cannot import name 'constants' from partially initialized module 'zmq.backend.cython' (most likely due to a circular import) (/cvmfs/sft.cern.ch/lcg/views/LCG_101swan/x86_64-centos7-gcc8-opt/lib/python3.9/site-packages/zmq/backend/cython/__init__.py)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/cvmfs/atlas.cern.ch/repo/sw/software/0.0/StatAnalysis/0.0.2/InstallArea/x86_64-centos7-gcc11-opt/lib/JupyROOT/kernel/rootkernel.py", line 25, in <module>
    raise Exception("Error: package metakernel not found.(install it running 'pip install metakernel')")
Exception: Error: package metakernel not found.(install it running 'pip install metakernel')

metakernel is a dependency of the ROOT C++ kernel; it is not present in the ATLAS stack, but it is found in the LCG stack. metakernel uses zmq (via ipykernel), which ends up erroring out. A similar thing happens with the python kernel.

For this to work, we would probably need metakernel, ipykernel and zmq installed in the ATLAS stack for its python, to prevent falling back to the LCG stack. I am happy to help test this and make it work.

Note that when we allow users to configure alternative stacks in SWAN for their sessions, the LCG environment will no longer be set underneath, so the alternative stack will be “on its own”. This means it will need to provide everything required to make Jupyter kernels work. This is another good reason to install the aforementioned packages in the ATLAS stack.

Thanks for debugging. I can certainly add those three packages to the ATLAS stack. Would they all be included, though, if I just pip install ‘jupyter’ in the stack?

As far as SWAN is concerned, I think it would be enough to install ipykernel and metakernel. The Jupyter/JupyterLab server in SWAN runs with packages we install in the container image of the user session (they don’t come from the LCG releases), so in principle jupyter and jupyter-lab should not be necessary. But this needs to be tested.
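Assuming the ATLAS stack’s python has pip available, installing the two packages into the stack could look something like the following. This is only a sketch: /path/to/statanalysis/python3 is a placeholder, and the actual install procedure for the CVMFS deployment is up to the release builders.

```shell
# Illustrative only: install the kernel dependencies with the stack's own
# python, so the kernel's imports resolve inside the ATLAS stack instead of
# spilling over into the LCG view. The interpreter path is a placeholder.
/path/to/statanalysis/python3 -m pip install ipykernel metakernel
```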

Hello,

I’m revisiting this issue now. We added ipykernel and metakernel to the stack (versions 6.15.0 and 0.29.0 respectively), so please use 0.0.3 instead of 0.0.2 in the setup above.
But it still seems that, with the above setup script, our ATLAS stack python is being interfered with by the SWAN python.

I see this if I do import metakernel in the python prompt, for example.

Do you know what we should try next here?
Thanks
Will

Actually, a small follow-up to this: I know you have said a few times not to unset PYTHONPATH, just to prepend to it, but that seems to be the source of my issues here. If I unset PYTHONPATH as part of my setup script, the imports work OK.

But the notebook still hangs at “Kernel starting”, so I don’t understand how to proceed, sorry :(
Thanks
Will

So, playing around again this afternoon, it does actually all seem to work now! I can start notebooks and they pick up our environment after all. Excellent.

Many thanks for the help.

One comment: it would be nice if the environment script at the SWAN configuration step supported passing arguments to it. That way we won’t need a separate script for each of our releases, just one script that we pass the release number to.

Great that it works! Could you share the recipe you followed in the end (what you installed in your stack, and what you do in the user env script)?

Regarding your comment, I guess that issue will be solved when we provide your stack in the list of available stacks for SWAN? As I understand it, each release will have it’s own stack, path and setup script on CVMFS and the user will just select the stack they want from a drop-down when starting the SWAN session? Or will you need some further parameterized setup via a user environment script? Even in the latter case, your software stack could define some variables that then the user environment script could rely on (instead of passing parameters).