Add a way to run a command before starting the kernel itself #127
a2km provides the third option, which I think is the best, by creating a script that wraps env activation and launching the kernel. The most important aspect of that particular implementation is the use of `exec`, so the kernel process replaces the wrapper shell instead of running as its child.
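For illustration, here is a minimal sketch of that wrapper-script approach (not a2km's actual code; the env name and kernel directory are made up): a shell script activates the environment, then `exec`s the kernel, and a kernel spec points at the script instead of python.

```python
import json
import stat
from pathlib import Path

# Placeholder env name and kernelspec location -- adjust for your setup.
env_name = "myenv"
kernel_dir = Path.home() / ".local/share/jupyter/kernels" / ("conda-" + env_name)
kernel_dir.mkdir(parents=True, exist_ok=True)

# The wrapper activates the environment, then *exec*s the kernel so the
# kernel process replaces the shell instead of running as its child.
wrapper = kernel_dir / "launch.sh"
wrapper.write_text(
    "#!/bin/bash\n"
    "source activate " + env_name + "\n"
    'exec python -m ipykernel_launcher "$@"\n'
)
wrapper.chmod(wrapper.stat().st_mode | stat.S_IEXEC)

# Kernel spec whose argv points at the wrapper instead of python directly.
spec = {
    "argv": [str(wrapper), "-f", "{connection_file}"],
    "display_name": "Python (" + env_name + ")",
    "language": "python",
}
(kernel_dir / "kernel.json").write_text(json.dumps(spec, indent=1))
```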
:-/ Also: ruby :-/ So the third option is the way to go? That would mean that https://github.com/Cadair/jupyter_environment_kernels (a different solution to the "one kernel per env" problem, which adds a kernel on the fly without writing a kernel spec) would need to create a batch/bash script for each kernel and save it somewhere :-/ bash could `exec`, windows would create a … The "add a 'before startup command' + run it and read the env + set the env for the kernel" option is not acceptable, is it?
@JanSchulz If that project used …
That project doesn't do anything to the launch sequence; it only adds the environment entries when asked about the installed kernels (by extending the `KernelSpecManager`).

What would be needed is a way to tell launch_kernel to use `shell=True`. There also seems to be no way to add to the kw args from the kernel spec, as start_kernel uses the arguments from the start_kernel call to set these :-(

One way to solve this is to simply scan the kernel start line in …
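To make the request concrete, here is a rough sketch of the kind of hook being asked for. It is not an existing jupyter_client feature: it assumes the private `_launch_kernel` method that KernelManager uses to call `launch_kernel`, so it could break between releases.

```python
from jupyter_client.manager import KernelManager
from jupyter_client.launcher import launch_kernel


class ShellKernelManager(KernelManager):
    """Sketch: start the kernel through a shell so 'activate env && python ...' works.

    Not part of jupyter_client; it relies on the private _launch_kernel hook.
    """

    def _launch_kernel(self, kernel_cmd, **kw):
        # With shell=True, Popen on POSIX expects a single command string,
        # so the argv list from the kernel spec is joined here.
        cmd = " ".join(kernel_cmd)
        return launch_kernel(cmd, shell=True, **kw)
```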
I don't think adding any way to use `shell=True` …
I guess that the env thingy is also not acceptable... :-) Ok, thanks for the advice here. I will take it back to https://github.com/Cadair/jupyter_environment_kernels, and there we will probably need to generate sh/cmd files :-/
We allow declaratively setting extra environment variables, but not modifying the existing content. Maybe we could special case adding directories to $PATH with a list in the kernelspec. But there are several other path-like things that you might want to modify similarly, and I don't see how we could cover them all.
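For reference, the declarative mechanism referred to is the `env` field of the kernel spec: entries listed there are added to the kernel's environment, but an existing variable such as PATH cannot be extended this way. A sketch of a kernel.json using it (variable names and values are invented):

```json
{
  "argv": ["python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
  "display_name": "Python (myenv)",
  "language": "python",
  "env": {
    "MY_LICENSE_SERVER": "1234@localhost",
    "CONDA_PREFIX": "/opt/conda/envs/myenv"
  }
}
```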
That would help (e.g. env could be set to …)
If you're starting a Python kernel, you could of course just write a Python script which sets environment variables and then launches the IPython kernel itself.
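A minimal sketch of such a launcher script (the paths and variable names are placeholders; `IPKernelApp.launch_instance` is ipykernel's standard entry point, though whether this is exactly what was meant here is an assumption):

```python
#!/usr/bin/env python
"""Sketch of a kernel launcher that fixes up the environment first."""
import os
import sys

# Prepend the environment's bin directory to PATH before the kernel starts.
# "/opt/myenv/bin" and MY_EXTRA_SETTING are placeholders.
os.environ["PATH"] = os.pathsep.join(["/opt/myenv/bin", os.environ.get("PATH", "")])
os.environ["MY_EXTRA_SETTING"] = "1"

from ipykernel.kernelapp import IPKernelApp

# Forward the arguments from kernel.json (e.g. -f {connection_file}).
IPKernelApp.launch_instance(argv=sys.argv[1:])
```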
Unfortunately, this would then need to be installed in every environment, so that it is started with the right python :-(
This is now solved in https://github.com/Cadair/jupyter_environment_kernels: it has a proxy dict implementation which is used as the env dict and loads the real env values of an activated env when it is first accessed.
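The idea, roughly (a sketch, not the project's actual code; the activation command is a placeholder): a dict-like object that, on first access, runs the activation command in a subshell, captures the resulting environment, and serves values from that snapshot.

```python
import json
import subprocess


class LazyActivatedEnv(dict):
    """Sketch of a lazily-populated env dict (not the project's actual code).

    On first access it runs a (placeholder) activation command in a shell,
    dumps the resulting environment as JSON and caches it, so listing kernels
    stays cheap and activation only happens when a kernel is actually started.
    """

    def __init__(self, activate_cmd):
        super().__init__()
        self._activate_cmd = activate_cmd  # e.g. "source activate myenv"
        self._loaded = False

    def _load(self):
        if not self._loaded:
            out = subprocess.check_output(
                self._activate_cmd
                + ' && python -c "import os, json; print(json.dumps(dict(os.environ)))"',
                shell=True,
            )
            self.update(json.loads(out.decode()))
            self._loaded = True

    def __getitem__(self, key):
        self._load()
        return super().__getitem__(key)

    def __iter__(self):
        self._load()
        return super().__iter__()

    def __len__(self):
        self._load()
        return super().__len__()


# Usage sketch:
# env = LazyActivatedEnv("source activate myenv")
# env["PATH"]  # activation happens here, on first access
```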
That's cool!
[From the gitter chat on Feb. 16, 22:46]
Use case: an environment needs to be activated to get the PATH set correctly. This can in some cases be tricky: e.g. conda lets you add batch files which are run on activate and can manipulate the PATH (that's at least my understanding; I would like to try that for a miktex package, so that miktex is on the PATH when the env is activated).
So in this case, it would be nice if the actual kernel command which is run would be `activate env & python ...`. This can be achieved in two ways:

- add `shell=True` to the Popen call which starts the kernel (according to https://stackoverflow.com/questions/17742789/running-multiple-bash-commands-with-subprocess); see the sketch after this list
- run the activation first, read/export the resulting environment, and set it as the env for the kernel process

Both need changes in jupyter_client: either a way to get the `shell=True` parameter into the Popen call, or the handling of the env export (and a flag in the kernel spec to trigger it).

A third option is using batch files as the kernel startup command in the kernel spec. The batch file would first activate the env and then call python with the kernel startup line and the command line parameters as given to the batch call. Unfortunately, this leads to #104, and it is unclear how to handle that, because it needs a writable place where the batch file can be created :-/
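For concreteness, here is what the first option amounts to at the subprocess level (a sketch; the env name and kernel command are placeholders, and it bypasses jupyter_client's own launcher):

```python
import subprocess

# Placeholder command: activate the env, then start the kernel in that shell.
# Windows-style "activate"; on POSIX it would be "source activate myenv".
cmd = "activate myenv && python -m ipykernel_launcher -f connection.json"

# shell=True makes Popen hand the whole line to the shell (cmd.exe on Windows,
# /bin/sh on POSIX), which is what allows the command chaining to work.
proc = subprocess.Popen(cmd, shell=True)
```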
I would like to get advice on what would be the best way to go forward here :-)