On 09.01.2016 at 22:26, Felix Schumacher wrote:
On 09.01.2016 at 22:15, Felix Schumacher wrote:
On 08.01.2016 at 22:23, Ahmad Alnafoosi wrote:
Hi,
I have a multi-stage JMeter performance test that exercises an HTTP REST API.
The test runs 10 concurrent users.
The test fails with an OutOfMemoryError on object uploads (PUT) that are
100 MB or larger.
Which sampler did you use, and how did you configure it?
Is your REST API sending back large amounts of data?
I have tested a simple upload page with a 400 MB file and 100
concurrent threads. The memory usage did not pass 250 MB.
I used "HTTP Request" sampler with implementation "HttpClient4",
method "POST" and added the file in the panel "Send File With the
Request:".
Oops, I just read, that you are using PUT. Will have to test again.
Your observation is right. The body data for PUT (as well as PATCH, DELETE,
and the WebDAV methods) gets stored in memory; POST has a mechanism to avoid
keeping the contents.
If you can compile and patch JMeter, you might want to try the attached
patch, which stores only the first 100 bytes of the sent files. (The patch
should probably reuse the mechanism from POST, but that would have been more
work for me, for now.)
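To illustrate the idea outside the diff, here is a stand-alone sketch of the truncating stream (the class name, the `limit` parameter, and the helper methods are mine for illustration; the patch itself inlines this as an anonymous OutputStream):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Stand-alone sketch of the patch's idea: keep only the first `limit`
// bytes written, append a "...\n" marker exactly once, and silently
// discard everything after that.
public class TruncatingOutputStream extends OutputStream {
    private final ByteArrayOutputStream bos = new ByteArrayOutputStream();
    private final int limit;
    private int counter = 0;

    public TruncatingOutputStream(int limit) {
        this.limit = limit;
    }

    @Override
    public void write(int b) throws IOException {
        if (counter < limit) {
            bos.write(b);
        } else if (counter == limit) {
            // Mark the truncation point exactly once.
            bos.write("...\n".getBytes());
        }
        counter++;
    }

    public byte[] toByteArray() {
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        try (TruncatingOutputStream out = new TruncatingOutputStream(5)) {
            out.write("hello world".getBytes());
            // Keeps "hello", then the truncation marker.
            System.out.println(new String(out.toByteArray()));
        }
    }
}
```

Because `OutputStream.write(byte[])` falls back to per-byte `write(int)` calls, overriding only `write(int)` is enough for this sketch, though a real fix would override the array variants too for speed.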
You can create a bugzilla entry if you like.
Regards,
Felix
I did some research on JMeter memory optimization but did not find
anything on the specific issue I am dealing with.
So I followed the general optimization recommendations and did the following:
1- Removed all listeners from the test
2- Ran from the non-GUI command line
3- Saved the JTL as CSV
4- Did all graphing and summary work as post-processing outside the test
5- Experimented with increasing the heap, up to 20 GB, as follows:
HEAP="-Xms20g -Xmx20g" (this allowed the 100 MB uploads to pass, but it
still failed at 1 GB and 5 GB file sizes)
6- Added an HTTP Cache Manager, enabled "Clear cache each iteration", and
limited "Max Number of elements in cache" to 3
None of the above helped in getting 10 concurrent users to PUT 5 GB files.
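For reference, steps 2, 3, and 5 combine into a single non-GUI invocation roughly like this (the test-plan and result file names are placeholders, not from the actual test; the heap increase is passed via JVM_ARGS, which the jmeter start-up script respects, instead of editing HEAP in the script):

```shell
# Placeholder plan/result names; -n = non-GUI, -t = test plan, -l = result log.
JVM_ARGS="-Xms20g -Xmx20g" ./bin/jmeter -n -t upload-test.jmx -l results.csv
```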
Is there a way to have 10 users upload 5 GB concurrently with the
default 512 MB heap?
Does JMeter cache all of the objects it uploads in its heap?
What heap size is recommended for the above scenario?
Your help is appreciated.
Thanks,
Ahmad
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
---------------------------------------------------------------------
diff --git a/src/protocol/http/org/apache/jmeter/protocol/http/sampler/HTTPHC4Impl.java b/src/protocol/http/org/apache/jmeter/protocol/http/sampler/HTTPHC4Impl.java
index 7b44a28..c8ef96b 100644
--- a/src/protocol/http/org/apache/jmeter/protocol/http/sampler/HTTPHC4Impl.java
+++ b/src/protocol/http/org/apache/jmeter/protocol/http/sampler/HTTPHC4Impl.java
@@ -1336,8 +1336,21 @@ public class HTTPHC4Impl extends HTTPHCAbstractImpl {
// our own stream, so we can return it
final HttpEntity entityEntry = entity.getEntity();
if(entityEntry.isRepeatable()) {
- ByteArrayOutputStream bos = new ByteArrayOutputStream();
- entityEntry.writeTo(bos);
+ final ByteArrayOutputStream bos = new ByteArrayOutputStream();
+ entityEntry.writeTo(new OutputStream() {
+ int counter = 0;
+
+ @Override
+ public void write(int b) throws IOException {
+ if (counter < 100) {
+ bos.write(b);
+ }
+ if (counter == 100) {
+ bos.write("...\n".getBytes());
+ }
+ counter++;
+ }
+ });
bos.flush();
// We get the posted bytes using the charset that was used to create them
entityBody.append(new String(bos.toByteArray(), charset));