John McNally wrote:
>
> "Geir Magnusson Jr." wrote:
>
> >
> > From: John McNally [mailto:[EMAIL PROTECTED]]
> >
> > > This patch fixes a memory leak in texen, where the repeated use of the
> > > context without caching templates leads to introspection cache growth.
> >
> > To be fair, it wasn't a memory leak, but rather a combination of factors
> > that caused new introspection information
> > to be repeatedly acquired and cached.
> >
>
> I was probably using "memory leak" in an imprecise manner, as I think
> Java's garbage collection is supposed to eliminate the possibility. I
> just meant it was possible to create a situation where what should
> amount to a couple of megabytes of valid data at any one time instead
> accumulates until it causes an out-of-memory error. To be really fair,
> it could be said that an application using texen should be sure to
> turn template caching on. A better fix would probably check the
> caching property and provide the dummy-proof Context wrapper if it is
> off, but use the context directly if on, so that performance can be
> maximized.
It was a great catch on your part. I know you spent some serious time
finding it, and I'm sorry about that.
I put some notes on this in the dev doc.
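
To sketch the caching-aware approach you describe (rough and untested;
the place this would hook into Texen and the way the caching flag gets
read are assumptions on my part, and the "wrapper" here is just a
chained VelocityContext):

    import org.apache.velocity.VelocityContext;
    import org.apache.velocity.context.Context;

    public final class ContextChoice
    {
        /**
         * Pick the context to hand to each template parse.
         *
         * @param shared          the long-lived context reused across parses
         * @param templateCaching whether template caching is switched on
         */
        public static Context contextFor(Context shared, boolean templateCaching)
        {
            if (templateCaching)
            {
                // Caching on: parsed templates are reused, so the shared
                // context can be used directly for maximum performance.
                return shared;
            }

            // Caching off: chain a throw-away VelocityContext around the
            // shared one. Lookups still see the shared data, but whatever
            // per-parse information accumulates is held by the short-lived
            // wrapper and can be garbage collected afterwards, instead of
            // piling up in the long-lived context.
            return new VelocityContext(shared);
        }
    }
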
Another solution would be to just reuse the same Template objects in the
loop (you could store them in a Map or something), and then the whole
issue goes away as well.
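
Something along these lines (just a sketch; it assumes Velocity.init()
has already been called and skips any real error handling):

    import java.io.Writer;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.velocity.Template;
    import org.apache.velocity.app.Velocity;
    import org.apache.velocity.context.Context;

    public class TemplateReuse
    {
        // template name -> Template, so each template is fetched only once
        private final Map templates = new HashMap();

        private Template templateFor(String name) throws Exception
        {
            Template t = (Template) templates.get(name);
            if (t == null)
            {
                t = Velocity.getTemplate(name);
                templates.put(name, t);
            }
            return t;
        }

        public void generate(String[] names, Context ctx, Writer out)
            throws Exception
        {
            for (int i = 0; i < names.length; i++)
            {
                // The same Template object is merged each time around the
                // loop, so nothing has to be re-acquired on later iterations.
                templateFor(names[i]).merge(ctx, out);
            }
        }
    }
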
It's cool that we have so many ways to deal with this.
> But I was very happy to see that what seemed like a very hard problem
> turned out to have a one line fix. :)
:)
geir
--
Geir Magnusson Jr. [EMAIL PROTECTED]
Developing for the web? See http://jakarta.apache.org/velocity/