Other people have tried large files in the past, and JMeter definitely
has a limit. The current implementation simply pipes the InputStream from
the file to the HTTP connection's OutputStream.
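For illustration, stream-to-stream piping like that usually looks something like the sketch below (class and method names are mine, not JMeter's actual code): copying in fixed-size chunks keeps memory use constant no matter how large the file is, so the upload body itself shouldn't be what exhausts the heap.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {
    // Copy the input to the output in fixed-size chunks, so only one
    // buffer's worth of data is ever held in memory at a time.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192]; // buffer size is an illustrative choice
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
            total += read;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for a large upload body; a real run would read the file
        // and write to the HTTP connection's output stream instead.
        byte[] payload = new byte[100_000];
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(payload), sink);
        System.out.println("copied " + copied + " bytes");
    }
}
```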

Beyond that, unless the server is hosted at a tier 1 or tier 2 ISP, chances
are the connection won't be able to handle more than a dozen concurrent
uploads. The best option is to use multiple clients to send large uploads.
Keep in mind, though, that 100 Mbit Ethernet can only handle about
9-10 MBytes/second.

10 Mbit of bandwidth can only handle about 1 MByte/second, and many 2nd and
3rd tier ISPs only give an account 10 Mbit of bandwidth. Getting a full
100 Mbit of bandwidth generally only happens at a tier 1 ISP like Level3,
MCI, Global Crossing, etc. Hope that helps. I cover this stuff in my
articles on JMeter's articles page.
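The numbers above are just a bits-to-bytes conversion (divide by 8), with the gap to the theoretical maximum accounting for protocol overhead. A quick sketch of the arithmetic (the class and method names are mine, for illustration only):

```java
public class Bandwidth {
    // Convert a link speed in megabits/second to megabytes/second.
    static double mbitToMByte(double mbit) {
        return mbit / 8.0; // 8 bits per byte
    }

    public static void main(String[] args) {
        // 10 Mbit/s: 1.25 MB/s theoretical, roughly 1 MB/s after overhead
        System.out.println(mbitToMByte(10));
        // 100 Mbit/s: 12.5 MB/s theoretical, roughly 9-10 MB/s after overhead
        System.out.println(mbitToMByte(100));
    }
}
```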

peter lin


On 11/28/05, Will Meyer <[EMAIL PROTECTED]> wrote:
>
> Hi.  I have a test that, among many other things, uploads a huge file via
> a
> POST.  I have all of the listeners disabled, and am running the test in
> command-line mode.
>
> Eventually, with high enough numbers of loops and threads, I run out of
> memory.  I have set the max heap quite high.  Does anyone know what the
> memory implications in JMeter are of having a big (1M) file upload in the
> post?  Does it keep the data in memory for some period of time?
>
> Related, is it correct to assume that, with no listeners, the memory
> consumption of the process will not grow beyond that required (all else
> being equal) to hold tests for all the threads at one time, and that there
> is no reason it should grow beyond that point?  In other words, anything
> that would cause memory to grow over time that I could take a look at?  I
> would have thought that once I got to a steady-state, memory would spike up
> and down as the tests are run and the GC runs, but the overall trend
> should
> be flat.
>
> Thanks,
>
> Will
>
>
