I've deployed a JupyterHub service on an EC2 instance (running as a system 
service) that uses dockerspawner.SystemUserSpawner to launch a 
jupyter/docker-stacks minimal-notebook environment for each user. I've 
installed and activated nb_conda and nb_conda_kernels (trying both) in 
both the root conda environment running JupyterHub (non-containerized) and 
the user containers, by building a Docker image with the Jupyter 
docker-stack as the base.
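
For reference, my spawner configuration looks roughly like this (the image name and home-directory pattern are placeholders for my actual values):

```python
# jupyterhub_config.py -- sketch of my setup; image name is a placeholder
c = get_config()  # noqa: F821 -- injected by JupyterHub at config load time

# Spawn each user's server in a Docker container running as their system user
c.JupyterHub.spawner_class = 'dockerspawner.SystemUserSpawner'

# Custom image built on a jupyter/docker-stacks base, with
# nb_conda_kernels installed via conda
c.SystemUserSpawner.image = 'myorg/minimal-notebook-conda:latest'

# Mount each user's home directory into the container so their
# conda environments should (in theory) be visible
c.SystemUserSpawner.host_homedir_format_string = '/home/{username}'
```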

When using a custom Docker image built with nb_conda/nb_conda_kernels 
installed, the functionality does show up within JupyterLab: [root] 
Python, Python [default], etc. (I can't remember the exact layout, but it 
shows that it is enabled and working). However, none of the actual system 
user's conda kernels are shown. I've searched high and low about this and 
found multiple issues on GitHub that seem related, but none of them resolved 
the issue for me, so I figured I would ask the community directly.

I should mention that launching `jupyter notebook` manually from my home 
directory works properly and shows my 5+ conda kernels, so somewhere in 
the JupyterHub stack the configuration of user-specific kernels is being 
lost.
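
In case it helps with diagnosis, here is the stdlib-only sketch I've been using to check which kernelspec directories actually exist in a given environment (run once on the host and once inside a spawned container; the paths follow Jupyter's standard Linux search order, and the exact precedence is an assumption on my part):

```python
import os
import sys
from pathlib import Path


def kernelspec_dirs():
    """Return the standard Jupyter kernelspec directories that exist.

    Paths are the usual Linux locations; macOS/Windows differ.
    """
    candidates = [
        Path(sys.prefix) / "share" / "jupyter" / "kernels",        # env-level
        Path.home() / ".local" / "share" / "jupyter" / "kernels",  # user-level
        Path("/usr/local/share/jupyter/kernels"),                  # system-level
        Path("/usr/share/jupyter/kernels"),
    ]
    # Entries in JUPYTER_PATH take precedence when the variable is set
    for entry in os.environ.get("JUPYTER_PATH", "").split(os.pathsep):
        if entry:
            candidates.insert(0, Path(entry) / "kernels")
    return [d for d in candidates if d.is_dir()]


if __name__ == "__main__":
    for d in kernelspec_dirs():
        print(d, "->", sorted(p.name for p in d.iterdir()))
```

Inside the container the user-level directory comes from the mounted home, so comparing the two runs shows whether the container can even see the kernelspecs that `jupyter notebook` finds on the host.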

I appreciate any help I can get with this, as I've hit my head on my desk 
multiple times at this point!

Thanks,
Jorden

-- 
You received this message because you are subscribed to the Google Groups 
"Project Jupyter" group.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/jupyter/6c9e4f67-9f84-478e-a051-c6cfa6b72fcf%40googlegroups.com.