[ 
https://issues.apache.org/jira/browse/VELOCITY-570?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jacob Mundt updated VELOCITY-570:
---------------------------------

    Attachment: expandbuff-speedup-reinit-fix.patch

The original patch doesn't reset the nextBufExpand increment when ReInit is 
called, so even after ReInit is called with a tiny buffer size, the 
nextBufExpand increment can still be very large. My simple test loop 
(repeatedly parsing a 3MB template) actually ran out of memory after a few 
iterations.

I've attached a patch that fixes this. I also changed the initial increment to 
be based on the buffer size, rather than always using 2048.
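The fix described above can be sketched as follows. This is a minimal standalone model, not the actual VelocityCharStream code; everything except the ExpandBuff/ReInit/nextBufExpand names (which come from the discussion) is an assumption.

```java
// Hypothetical sketch of the buffer-expansion bookkeeping discussed above.
// Not the real VelocityCharStream; only nextBufExpand, ExpandBuff, and
// ReInit are names taken from the issue.
public class CharBufferSketch {
    private char[] buffer;
    private int nextBufExpand; // amount to grow by on the next expandBuff call

    public CharBufferSketch(int bufsize) {
        buffer = new char[bufsize];
        nextBufExpand = bufsize; // base the initial increment on the buffer size
    }

    // Grow the buffer by nextBufExpand chars, then double the increment so
    // that repeated expansions are geometric rather than linear.
    public void expandBuff() {
        char[] newBuffer = new char[buffer.length + nextBufExpand];
        System.arraycopy(buffer, 0, newBuffer, 0, buffer.length);
        buffer = newBuffer;
        nextBufExpand *= 2;
    }

    // The bug: without resetting nextBufExpand here, a stream re-initialized
    // with a tiny buffer keeps the huge increment left over from the previous
    // parse, and a loop of parses can run out of memory.
    public void reInit(int bufsize) {
        buffer = new char[bufsize];
        nextBufExpand = bufsize;
    }

    public int capacity() { return buffer.length; }
}
```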

> speed improvement of the tokenizer
> ----------------------------------
>
>                 Key: VELOCITY-570
>                 URL: https://issues.apache.org/jira/browse/VELOCITY-570
>             Project: Velocity
>          Issue Type: Bug
>          Components: Engine
>    Affects Versions: 1.4, 1.5, 1.6
>         Environment: Tested on FreeBSD 6.2-STABLE and Linux (Debian Etch) on 
> i386.
> Java: JDK 1.5
>            Reporter: Ronald Klop
>             Fix For: 1.6
>
>         Attachments: expandbuff-speedup-reinit-fix.patch, 
> expandbuff-speedup.patch
>
>
> On some large templates (1-4MB) Velocity gets very slow. I used JProfiler and 
> found that a lot of time is spent in VelocityCharStream.ExpandBuff, which is 
> doing a lot of System.arraycopy calls.
> The problem is that the size of the buffer is increased linearly instead of 
> exponentially.
> I have made a patch which doubles the size of the buffer instead of 
> incrementing it by a fixed amount.
> My tests and JProfiler both show that far less time is spent in ExpandBuff.
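The quoted claim can be illustrated with a small standalone model that counts how many expansions (and hence arraycopies) each growth policy needs to reach a ~3MB buffer. The starting size and fixed 2048-char step are assumptions for this toy model only; the counts are not measurements of Velocity itself.

```java
// Toy model comparing linear vs. doubling buffer growth, as described in
// the quoted report. Assumed parameters: 4096-char initial buffer,
// 2048-char fixed step for the linear case.
public class GrowthComparison {
    // Number of expansions needed to reach `target` capacity starting from
    // `initial`, growing by a fixed `step` each time (each expansion implies
    // one System.arraycopy of the whole buffer).
    public static int linearExpansions(int initial, int step, int target) {
        int size = initial, count = 0;
        while (size < target) { size += step; count++; }
        return count;
    }

    // Same, but the increment doubles after every expansion, so the buffer
    // size grows geometrically.
    public static int doublingExpansions(int initial, int target) {
        int size = initial, step = initial, count = 0;
        while (size < target) { size += step; step *= 2; count++; }
        return count;
    }
}
```

For a 3MB target, the linear policy needs on the order of 1500 expansions while the doubling policy needs about 10, which is consistent with the profiling result reported above.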

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.


---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]