Hello! Just a quick question: is there any way to send variables from a JupyterHub API request like POST /users/{name}/servers/{server_name} to the container? For example, if I set a key in the JSON body like { "key1": "value1" }, would it be possible to use that variable inside the container, in a notebook?
The short answer to this is no, for various reasons:
AFAIK, environment variables can only be populated when a process starts; there isn't a way to change them or add to them dynamically in a running kernel.
Variables could potentially be populated in newly started kernels, but I also don't think there is any existing way to pass environment variables that weren't already available when the JupyterLab instance started.
Hey @michalc. Basically, the use case is to pass some identifier to the container so it can retrieve data from the DB. I thought about using a pre-start script that uses this id to query the DB and fill a dataframe. Thanks!
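For what it's worth, a pre-start script along those lines could look something like this. This is only a sketch: the `RECORD_ID` variable name, the in-memory database, and the `records` table are all illustrative stand-ins for whatever the spawner and the real DB would provide.

```python
import os
import sqlite3

# RECORD_ID would normally be set by the spawner; default it here so the
# sketch is self-contained.
os.environ.setdefault("RECORD_ID", "42")

# Stand-in for the real database connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id TEXT, payload TEXT)")
conn.execute("INSERT INTO records VALUES ('42', 'hello')")

# Use the identifier from the environment to fetch the matching rows
# before the notebook starts.
record_id = os.environ["RECORD_ID"]
rows = conn.execute(
    "SELECT payload FROM records WHERE id = ?", (record_id,)
).fetchall()
print(rows)  # [('hello',)]
```

In a real setup the query result would typically be loaded into a dataframe (e.g. with `pandas.read_sql_query`) for use in the notebook.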
You can override get_env() on the Spawner, which runs in the hub just before the single-user server is created; the notebooks would then, I think, have access to those environment variables. Not sure if you wanted something more dynamic, done after the single-user server has started...
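To make that concrete, here is a minimal jupyterhub_config.py sketch (a config fragment, not a standalone script). It relies on the fact that the JSON body of the server-start POST request is available as the spawner's user_options; the subclass name and the USER_OPTION_ env-var prefix are my own, and LocalProcessSpawner is just one possible base class.

```python
# jupyterhub_config.py (sketch)
from jupyterhub.spawner import LocalProcessSpawner

class OptionsEnvSpawner(LocalProcessSpawner):
    def get_env(self):
        env = super().get_env()
        # user_options holds the JSON body sent with the POST request
        # that started this server; copy it into the environment.
        for key, value in (self.user_options or {}).items():
            env[f"USER_OPTION_{key.upper()}"] = str(value)
        return env

c.JupyterHub.spawner_class = OptionsEnvSpawner
```

With this in place, a body of { "key1": "value1" } would show up in the notebook as the environment variable USER_OPTION_KEY1.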