Hi, 
        A user can create as many notebooks as they like on a single Jupyter 
notebook server. We integrate Spark's PySpark into Jupyter through the 
IPython kernel, so every time a notebook is opened, a new kernel starts and 
initializes a PySpark shell (that is, a Spark driver). Since we run Spark in 
client mode, all of these drivers run on the notebook server host. If a user 
goes overboard and opens many notebooks at once, many drivers pile up on 
that single host, which can easily exhaust its available resources.
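        To make the setup concrete, each kernel ends up running something 
roughly like the sketch below; the master URL and the driver memory value 
are just illustrative placeholders, not our exact configuration:

    from pyspark import SparkConf, SparkContext

    # Every notebook kernel executes code along these lines, so each open
    # notebook holds its own driver process on the notebook server host.
    # "yarn" and "1g" are illustrative placeholders.
    conf = (SparkConf()
            .setMaster("yarn")
            .set("spark.submit.deployMode", "client")
            .set("spark.driver.memory", "1g"))
    sc = SparkContext(conf=conf)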
       So I am wondering: does the Jupyter notebook server have any 
mechanism to deal with or avoid this kind of issue?

Many thanks in advance!

Best Regards
Sherry
