> Anyway, I'll submit an issue to the jupyter/notebook repo later today
although I wish I could pin down better where exactly the leak is coming
from.

The updating mechanism in Interact doesn't use Jupyter a whole lot. It
happens in these lines of code:
<https://github.com/JuliaLang/Interact.jl/blob/17f8864b1ad65f3bdef532254fe06d86f60974ec/src/IJulia/ijulia.js#L28-L32>

The jQuery empty <http://api.jquery.com/empty/> documentation says: "To
avoid memory leaks, jQuery removes other constructs such as data and event
handlers from the child elements before removing the elements themselves."
But it seems something bad is still going on. I'll look into this.



On Fri, Nov 20, 2015 at 11:18 PM, Shashi Gowda <[email protected]>
wrote:

> If you install Patchwork.jl <http://github.com/shashi/Patchwork.jl> and
> re-run your notebook, it should fix this issue. (You might also need to
> precompile Compose again, which can be done by removing the .ji file in
> ~/.julia/lib/v0.4/.)
>
> Compose (which Gadfly uses for rendering to SVG) doesn't depend on
> Patchwork, but if you have it installed, it will use the Patchwork backend
> when you render a @manipulate of plots - Patchwork will then try to
> reconcile previously rendered DOM nodes so that there are no performance
> penalties of this sort.
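>
> Concretely, that amounts to something like this sketch (assuming the
> Julia 0.4 package manager and the default library path; adjust the path
> for your setup):
>
> ```julia
> Pkg.add("Patchwork")  # install Patchwork so Compose can pick it up
> # delete Compose's cached .ji file so it precompiles against Patchwork
> ji = joinpath(homedir(), ".julia", "lib", "v0.4", "Compose.ji")
> isfile(ji) && rm(ji)
> ```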
>
> On Fri, Nov 20, 2015 at 10:48 PM, Andrew Keller <
> [email protected]> wrote:
>
>> I think this is exactly what is happening. Some findings:
>>
>> 1) Run the code:
>>
>> using Interact, Gadfly
>>> @manipulate for φ=0:π/16:4π, f=[:sin => sin, :cos => cos]
>>>     plot((θ)->f(θ+φ),0,25)
>>> end
>>
>>
>> 2) Open Chrome dev tools --> Profiles --> take a heap snapshot.
>> 3) Click sin, cos, sin, cos, ... sin in the notebook.
>> 4) Take another heap snapshot and look at the comparison view.
>>
>> It looks like, among other things, there are a lot of SVGPathElements and
>> SVGTextElements belonging to detached DOM trees, suggesting the old plots
>> are never properly disposed of. If I instead capture a JS profile in the
>> Chrome dev tools --> Timeline panel, it appears that the number of nodes
>> and listeners increases without bound.
>>
>> Now suppose I use Winston instead of Gadfly. The memory still appears to
>> leak, although the plots are a little more lightweight and the leak is
>> slower.
>>
>> Anyway, I'll submit an issue to the jupyter/notebook repo later today
>> although I wish I could pin down better where exactly the leak is coming
>> from.
>>
>> On Thursday, November 19, 2015 at 10:35:31 AM UTC-8, Keno Fischer wrote:
>>>
>>> Sounds like the memory leak is on the browser side? Maybe something is
>>> keeping a JavaScript reference to the plot? Potentially a Jupyter/IJulia
>>> bug?
>>>
>>> On Thu, Nov 19, 2015 at 12:01 PM, Stefan Karpinski <[email protected]
>>> > wrote:
>>>
>>>> This should work – if there's a memory leak that's never reclaimed by
>>>> gc, that's a bug.
>>>>
>>>> On Thu, Nov 19, 2015 at 11:55 AM, Andrew Keller <[email protected]>
>>>> wrote:
>>>>
>>>>> Maybe generating a new plot every time is not great practice, on
>>>>> account of the performance hit. That being said, I think it's perfectly
>>>>> legitimate to do what I'm doing for prototyping purposes. I can achieve
>>>>> the frame rate I want, and the main example shown on
>>>>> https://github.com/JuliaLang/Interact.jl does the same thing I do,
>>>>> generating a new plot each time.
>>>>>
>>>>> In fact, I'd encourage anyone reading this to just try that example,
>>>>> and repeatedly click between sin and cos. I'm able to make the memory
>>>>> consumption of my browser grow without bound. Surely someone besides
>>>>> myself has noticed this before! I don't think loading another package
>>>>> is a serious solution to the problem I'm describing, although your
>>>>> package certainly looks useful for other purposes.
>>>>>
>>>>> Just to reiterate, this is not a small memory leak; this is like a
>>>>> memory dam breach. I'm happy to help debug this but some assistance would
>>>>> be appreciated.
>>>>>
>>>>>
>>>>> On Thursday, November 19, 2015 at 7:11:55 AM UTC-8, Tom Breloff wrote:
>>>>>>
>>>>>> You're creating a new Gadfly.Plot object every update, which is a bad
>>>>>> idea even if Gadfly's memory management were perfect. Plots.jl gives
>>>>>> you the ability to add to or replace the underlying data like this:
>>>>>>
>>>>>> using Plots
>>>>>> gadfly()
>>>>>> getxy() = (1:10, rand(10))
>>>>>> plt = plot(getxy()...)
>>>>>>
>>>>>> # overwrite underlying plot data without building a new plot
>>>>>> plt[1] = getxy()
>>>>>>
>>>>>>
>>>>>> You can also use familiar push! and append! calls.
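>>>>>>
>>>>>> For example, a sketch (the series index and exact signatures here are
>>>>>> my assumption; check the Plots.jl docs):
>>>>>>
>>>>>> ```julia
>>>>>> # add a single (x, y) point to series 1 of the existing plot
>>>>>> push!(plt, 1, 11, rand())
>>>>>> # append several points at once to the same series
>>>>>> append!(plt, 1, collect(12:20), rand(9))
>>>>>> ```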
>>>>>>
>>>>>> Let me know if this helps, and please post issues if you find bugs.
>>>>>> Of course, the memory issue could arise while redisplaying in IJulia,
>>>>>> in which case this method won't help.
>>>>>>
>>>>>> On Thursday, November 19, 2015, Andrew Keller <[email protected]>
>>>>>> wrote:
>>>>>>
>>>>>>> I'd like to use Interact to have a plot that updates frequently in a
>>>>>>> Jupyter notebook, but it seems like there is a large memory leak
>>>>>>> somewhere, and I am having trouble tracking down which package is
>>>>>>> responsible. Within a few minutes of running, the following code will
>>>>>>> cause the memory used by the web browser to balloon to well over 1 GB
>>>>>>> with no sign of slowing down. It is almost as if the memory allocated
>>>>>>> for displaying a particular plot is never deallocated:
>>>>>>>
>>>>>>> using Reactive, Interact, Gadfly
>>>>>>>
>>>>>>> @manipulate for
>>>>>>>>     paused=false,
>>>>>>>>     dt = fpswhen(lift(!, paused), 10)
>>>>>>>>     plot(x=collect(1:10),y=rand(10))
>>>>>>>> end
>>>>>>>
>>>>>>>
>>>>>>> I can observe this problem using Julia 0.4.1, together with the most
>>>>>>> recent releases of all relevant packages, in either Safari on OS X or
>>>>>>> Chrome on Windows 10.
>>>>>>>
>>>>>>> Here's hoping someone has an idea of what's going on or advice for
>>>>>>> how to track down this problem. It seems like something that many others
>>>>>>> should be experiencing.
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Andrew
>>>>>>>
>>>>>>
>>>>
>>>
>
