Not sure if this will help, but you can launch Jupyter with `ipython 
notebook` at the command line (then select a Julia kernel). If you make sure 
to `File -> Close and Halt` your notebooks (instead of just closing the 
tabs), they won't keep taking up memory (it shuts down the kernel process). 

As far as I know, pressing Ctrl-C twice is still the official way of shutting 
down Jupyter, but I don't think I've ever had a case where Jupyter itself 
took up too much memory. I did have instances (with lots of plots) where 
Firefox held on to my RAM after I closed the notebooks. Which process is 
taking up the memory in your case?
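If you're not sure which process is holding the RAM, a quick check from a 
Linux terminal (using standard procps `ps` options; the process names below 
are just examples, yours may differ) is:

```shell
# List the processes using the most resident memory (RSS), largest first,
# to see whether it's the julia kernels, the notebook server, or the browser.
ps aux --sort=-rss | head -n 10

# Narrow it down to the usual suspects by name (ignore errors if a name
# isn't currently running):
ps -C julia,firefox -o pid,rss,comm 2>/dev/null || true
```

If a julia process tops the list after you've closed its notebook tab, the 
kernel is still alive and `Close and Halt` (or killing it by PID) should 
free the memory.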

On Tuesday, November 3, 2015 at 5:28:57 AM UTC-5, Sisyphuss wrote:
>
> I encountered similar problem on my Ubuntu.
>
>
> On Tuesday, November 3, 2015 at 11:24:02 AM UTC+1, Ferran Mazzanti wrote:
>>
>> Dear all,
>>
>> I use IJulia from time to time. To do that I open the julia REPL in the 
>> terminal, issue 
>>
>> using IJulia
>> notebook()
>>
>> ...and the web browser opens and I am ready to go. But when I want to 
>> finish the IJulia session, I don't see a 'close' button in Jupyter's file 
>> manager, so I have to close the tab (or browser) where I'm working. Then 
>> the REPL gets stuck, so I have to do Ctrl+C and quit from julia.
>> A very uncomfortable way of proceeding... and it leaves the memory of my 
>> Linux Mint box full of garbage, to the point that it can swallow 12+ GB 
>> of my RAM.
>>
>> Now I'm sure there are better ways to proceed... but how? 
>>
>> Best regards and thanks,
>>
>> Ferran.
>>
>
