Hello.

I'm working on an application that uses WebKit as a "rendering farm". A
running program has a shared memory segment and a WebKit instance using
the Gtk+ port.

I've modified WebKit and added:
 - support for off-screen rendering: instead of showing pages on the
screen, I render them onto a cairo surface using the patches from the
Clutter project,
 - a function that simulates a click on the page at given X,Y
coordinates,
 - a new signal in the Gtk+ port that is emitted whenever a visible
change occurs on the page (loading HTML, JavaScript modifications).

The HTML page contains a DIV tag that is sensitive to mouse clicks; a
click on that DIV causes the page to change.

Now I have a client program that connects to the WebKit program over
IPC. It loads the HTML page mentioned above and then sends a click
request every 0.25 seconds with the coordinates of the DIV tag.

What I've noticed is that, after a while, the memory usage of the
WebKit process grows. After running for about one hour, the process
memory had increased by about 7 MB.

Now I'm asking: is this memory growth a result of WebKit's caching
subsystem, or could it be a leak? Keep in mind that I click on and
refresh the same page over and over again.

Greets,
Luka

_______________________________________________
webkit-dev mailing list
[email protected]
http://lists.webkit.org/mailman/listinfo.cgi/webkit-dev
