Re: Avoiding OutOfMemory Errors by limiting data in pipeline

2008-05-12 Thread Peter Hunsberger
On Sun, May 11, 2008 at 6:59 PM, Joerg Heinicke wrote: On 09.05.2008 09:41, Peter Hunsberger wrote: I haven't looked at the code here, but couldn't you just introduce a second getOutputStream(int bufferSize) method where the current interface method continues with
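
A rough sketch of what such a second method could look like, purely for illustration -- the interface name below is made up and this is not the actual Cocoon Environment contract:

    import java.io.IOException;
    import java.io.OutputStream;

    // Illustrative only; not the real Cocoon interface. The existing
    // single-argument method keeps its current meaning, and a second
    // method adds a separate initial size next to the existing limit.
    public interface BufferedOutputSource {

        // existing contract: one buffer size, as callers use it today
        OutputStream getOutputStream(int bufferSize) throws IOException;

        // possible second method: initial size plus the existing limit
        OutputStream getOutputStream(int initialBufferSize, int bufferSize)
                throws IOException;
    }

An implementation could then keep the old method as a one-line delegate, e.g. return getOutputStream(someDefaultInitialSize, bufferSize), where the default is whatever value the hypothetical configuration supplies.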

Re: Avoiding OutOfMemory Errors by limiting data in pipeline

2008-05-11 Thread Joerg Heinicke
On 09.05.2008 09:41, Peter Hunsberger wrote: I think this is rather hard to do. The place where we instantiate the BufferedOutputStreams (both java.io and o.a.c.util) is AbstractEnvironment.getOutputStream(int bufferSize). So in order to pass a second buffer size argument to the
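
For readers without the source at hand, the situation being described looks roughly like this; a hedged sketch, not the real AbstractEnvironment code, and the class and field names are assumed:

    import java.io.BufferedOutputStream;
    import java.io.IOException;
    import java.io.OutputStream;

    // Hedged sketch of the problem described above -- not the actual
    // AbstractEnvironment source.
    public abstract class EnvironmentSketch {

        protected OutputStream outputStream;  // raw response stream (assumed field)

        // Only one int reaches this method, so a separate initial buffer size
        // has nowhere to come from without widening this signature or the
        // interface that declares it.
        public OutputStream getOutputStream(int bufferSize) throws IOException {
            if (bufferSize <= 0) {
                return this.outputStream;  // unbuffered
            }
            // This is the spot where either java.io.BufferedOutputStream or
            // o.a.c.util.BufferedOutputStream gets created; the java.io variant
            // is used here because its (stream, size) constructor is standard.
            return new BufferedOutputStream(this.outputStream, bufferSize);
        }
    }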

Re: Avoiding OutOfMemory Errors by limiting data in pipeline

2008-05-09 Thread Peter Hunsberger
On Fri, May 9, 2008 at 12:08 AM, Joerg Heinicke wrote: On 08.05.2008 11:53, Bruce Atherton wrote: <snip/> I think this is rather hard to do. The place where we instantiate the BufferedOutputStreams (both java.io and o.a.c.util) is AbstractEnvironment.getOutputStream(int

Re: Avoiding OutOfMemory Errors by limiting data in pipeline

2008-05-08 Thread Bruce Atherton
My only comment is that I think it would be good to allow the initial buffer size to be configurable. If you know the bulk of your responses are greater than 32K, then performing the ramp-up from 8K every time would be a waste of resources. For another web site, if most responses were smaller
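
As an illustration of the configurable-initial-size idea only -- this is not Cocoon's own o.a.c.util.BufferedOutputStream, and all names and values below are assumptions:

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.OutputStream;

    // Sketch of a buffer with a configurable starting size and a hard
    // limit beyond which data is flushed downstream.
    public class GrowingBufferOutputStream extends OutputStream {

        private final OutputStream sink;
        private final int maxBufferSize;             // flush to the sink beyond this
        private final ByteArrayOutputStream buffer;  // grows on demand from the initial size

        public GrowingBufferOutputStream(OutputStream sink,
                                         int initialBufferSize,
                                         int maxBufferSize) {
            this.sink = sink;
            this.maxBufferSize = maxBufferSize;
            // Only the initial size is reserved up front; a site whose responses
            // are mostly over 32K could simply configure a larger starting value
            // instead of ramping up from 8K on every request.
            this.buffer = new ByteArrayOutputStream(initialBufferSize);
        }

        @Override
        public void write(int b) throws IOException {
            if (buffer.size() >= maxBufferSize) {
                flushBuffer();                       // cap the data held per request
            }
            buffer.write(b);
        }

        private void flushBuffer() throws IOException {
            buffer.writeTo(sink);
            buffer.reset();
        }

        @Override
        public void flush() throws IOException {
            flushBuffer();
            sink.flush();
        }

        @Override
        public void close() throws IOException {
            flush();
            sink.close();
        }
    }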

Re: Avoiding OutOfMemory Errors by limiting data in pipeline

2008-05-08 Thread Antonio Gallardo
Hi Joerg, I am +1. One question, what are supposed to be the default values for both parameters? Best Regards, Antonio Gallardo. Joerg Heinicke wrote: On 27.04.2008 23:43, Joerg Heinicke wrote: 2. Does the full amount of the buffer automatically get allocated for each request, or

Re: Avoiding OutOfMemory Errors by limiting data in pipeline

2008-05-08 Thread Joerg Heinicke
On 08.05.2008 11:53, Bruce Atherton wrote: My only comment is that I think it would be good to allow the initial buffer size to be configurable. If you know the bulk of your responses are greater than 32K, then performing the ramp-up from 8K every time would be a waste of resources. For

Re: Avoiding OutOfMemory Errors by limiting data in pipeline

2008-05-08 Thread Joerg Heinicke
On 08.05.2008 12:16, Antonio Gallardo wrote: One question, what are supposed to be the default values for both parameters? For the initial buffer size I thought of 8K, maybe 16K. It should be a reasonable size that's not overly large (i.e. unnecessarily reserved memory) for most of the
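
Expressed in code, that proposal amounts to a single default; the class and constant names below are invented, and only the 8K (maybe 16K) value comes from this message -- the default for the second parameter is not stated in the quoted text, so it is not guessed at here:

    // Hypothetical holder for the proposed default initial buffer size.
    public final class BufferDefaults {
        public static final int DEFAULT_INITIAL_BUFFER_SIZE = 8 * 1024;  // or 16 * 1024
        private BufferDefaults() { }
    }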

Re: Avoiding OutOfMemory Errors by limiting data in pipeline

2008-05-07 Thread Joerg Heinicke
On 27.04.2008 23:43, Joerg Heinicke wrote: 2. Does the full amount of the buffer automatically get allocated for each request, or does it grow gradually based on the xml stream size? I have a lot of steps in the pipeline, so I am worried about the impact of creating too many buffers even if
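
The two allocation behaviours the question distinguishes can be contrasted in a few lines; a standalone illustration using plain JDK classes, with all sizes assumed rather than taken from Cocoon:

    import java.io.ByteArrayOutputStream;

    // Illustrative contrast only; not Cocoon code.
    public class BufferAllocationDemo {
        public static void main(String[] args) {
            int maxBufferSize = 64 * 1024;     // assumed configured limit
            int initialBufferSize = 8 * 1024;  // assumed initial size

            // (a) full amount reserved up front for every request,
            //     even when the response turns out to be tiny
            byte[] upFront = new byte[maxBufferSize];

            // (b) gradual growth driven by the actual stream size: the backing
            //     array starts small and is enlarged only when data no longer fits
            ByteArrayOutputStream gradual = new ByteArrayOutputStream(initialBufferSize);
            gradual.write(new byte[2048], 0, 2048);  // a 2K response never grows past 8K

            System.out.println("up-front capacity: " + upFront.length);
            System.out.println("bytes actually buffered: " + gradual.size());
        }
    }

With the second behaviour, the per-request cost of each buffering step scales with what that step actually writes rather than with the configured maximum, which is the concern behind the question about pipelines with many steps.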

Re: Avoiding OutOfMemory Errors by limiting data in pipeline

2008-04-27 Thread Joerg Heinicke
On 24.04.2008 16:08, Bruce Atherton wrote: Thanks for the response. About setting the buffer size, this looks like it could be what I am looking for. A few questions: 1. Do I have to set the buffer size on each transformer and the serializer as well as the generator? What about setting it on