I haven't looked into the details of your configuration, but in
general, Jupyter and the Python kernels have no problem using whatever
RAM the system makes available to them. I have created single NumPy
arrays with 1 TB of RAM with no problem. If you are running out of RAM,
it is a system-level thing.
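If you want to check this from inside a notebook yourself, a minimal sketch (the array size here is arbitrary, just small enough to allocate comfortably):

```python
import numpy as np

# Allocate a 128 MiB float64 array; the kernel simply asks the OS for
# memory, so the same call works for much larger arrays given enough RAM.
arr = np.zeros((1024, 1024, 16), dtype=np.float64)
print(arr.nbytes)  # 1024 * 1024 * 16 * 8 = 134217728 bytes (128 MiB)
```

If an allocation like this fails, the error comes from the operating system (or a container/cgroup limit), not from Jupyter itself.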
On 20 November 2017 at 18:58, Karthik Ram wrote:
> Argg. Thank you Thomas. It did run longer (17 min as opposed to 10 min)
> this time after I un-commented those lines, but still saw the same issue.
> Is there any limitation in Jupyter that it cannot handle more than a certain
On 19 November 2017 at 22:28, Karthik Ram wrote:
> I also changed the following in jupyterhub_config.py file. But still
> seeing the issue.
>
> ...
>
> #c.Spawner.mem_guarantee = 8G
>
>
You'll need to uncomment these lines for them to have any effect, i.e.
remove the # from the start of each line.
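For reference, the uncommented line in jupyterhub_config.py would look something like this (the 8G value is taken from your snippet; note the quotes, since a bare 8G is a Python syntax error, and note that mem_guarantee is only enforced by spawners that support it, not by the default LocalProcessSpawner):

```python
# jupyterhub_config.py -- the memory setting with the leading '#' removed.
# The value must be a string; JupyterHub parses the K/M/G/T suffix.
c.Spawner.mem_guarantee = '8G'  # minimum RAM the spawner should reserve
```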
I also changed the following in jupyterhub_config.py file. But still seeing
the issue.
## Minimum number of bytes a single-user notebook server is guaranteed to
#  have available.
#
#  Allows the following suffixes:
#    - K -> Kilobytes
#    - M -> Megabytes
#    - G -> Gigabytes
#    - T -> Terabytes