Re: JMeter 2.5 performance
Thank you, sebb! After updating to the new code from SVN, everything looks back to normal, especially when using HttpClient4 in the HTTP sampler.

2011/9/2 sebb:
> On 2 September 2011 16:44, Vance Zhao wrote:
>> I'm also hitting the slow results when using the HTTP sampler with
>> HttpClient4. I found it easily eats up all the resources, and I actually
>> got an OOM on the JMeter client side. Did we forget to close the
>> connection in JMeter?
>
> I already wrote (in this thread) that there is a bug in the HttpClient4
> implementation - it does not re-use connections when it should.
>
> This has been fixed in SVN and will be in the next release, whenever that
> is.
Re: JMeter 2.5 performance
On 2 September 2011 16:44, Vance Zhao wrote:
> I'm also hitting the slow results when using the HTTP sampler with
> HttpClient4. I found it easily eats up all the resources, and I actually
> got an OOM on the JMeter client side. Did we forget to close the
> connection in JMeter?

I already wrote (in this thread) that there is a bug in the HttpClient4
implementation - it does not re-use connections when it should.

This has been fixed in SVN and will be in the next release, whenever that is.
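For readers wondering what "re-uses connections" means in HttpClient4 terms, here is a minimal sketch of the intended pattern, assuming the HttpClient 4.1-style API that JMeter 2.5 bundles: one client backed by a shared pooling connection manager, with each response entity consumed so its connection returns to the pool. This is not the actual JMeter sampler code; the class name, URL, and pool sizes are invented for illustration.

import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.DefaultHttpClient;
import org.apache.http.impl.conn.tsccm.ThreadSafeClientConnManager;
import org.apache.http.util.EntityUtils;

public class ConnectionReuseSketch {
    public static void main(String[] args) throws Exception {
        // One pooling connection manager and one client shared across all
        // requests, instead of a fresh client (and fresh socket) per sample.
        ThreadSafeClientConnManager cm = new ThreadSafeClientConnManager();
        cm.setMaxTotal(200);            // illustrative pool size
        cm.setDefaultMaxPerRoute(100);  // illustrative per-host limit
        HttpClient client = new DefaultHttpClient(cm);

        for (int i = 0; i < 5; i++) {
            HttpResponse response =
                    client.execute(new HttpGet("http://localhost:8080/")); // example URL
            // Consuming the entity releases the connection back to the pool;
            // skipping this keeps the connection checked out for good.
            EntityUtils.consume(response.getEntity());
        }

        client.getConnectionManager().shutdown();
    }
}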
RE: JMeter 2.5 performance
I'm also hitting the slow results when using the HTTP sampler with
HttpClient4. I found it easily eats up all the resources, and I actually got
an OOM on the JMeter client side. Did we forget to close the connection in
JMeter?
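A general HttpClient4 note rather than a statement about JMeter internals: a pooled connection is only returned to the manager once the response entity has been fully consumed (or the request aborted), so code that drops responses on the floor pins one connection per request until sockets or heap run out. A hedged illustration of the release pattern (class and method names invented, URL is an example):

import org.apache.http.HttpEntity;
import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.DefaultHttpClient;
import org.apache.http.util.EntityUtils;

public class ConnectionReleaseSketch {
    // Fetch a URL and always release the underlying connection: consuming
    // the entity hands it back to the manager, aborting discards it.
    // Either way it is not leaked.
    static String fetch(HttpClient client, String url) throws Exception {
        HttpGet get = new HttpGet(url);
        try {
            HttpResponse response = client.execute(get);
            HttpEntity entity = response.getEntity();
            // EntityUtils.toString() reads the body to the end, which
            // releases the connection.
            return entity != null ? EntityUtils.toString(entity) : "";
        } catch (Exception e) {
            get.abort(); // give the connection up rather than leak it
            throw e;
        }
    }

    public static void main(String[] args) throws Exception {
        HttpClient client = new DefaultHttpClient();
        System.out.println(fetch(client, "http://localhost:8080/").length()); // example URL
        client.getConnectionManager().shutdown();
    }
}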
RE: JMeter 2.5 performance
Sebb asked:
>>> How do the average elapsed times compare?
>>
>> The average times are about 50% longer than with 2.4 - so where I was
>> averaging ~1000ms for the 'submit' (POST) action, it is now showing about
>> 1500 to 1750ms. (I am mostly concerned with the 'POST' because that's
>> where all the real work occurs on our server side.)
>
> Are all the methods equally affected?

Hard to say, I wasn't recording them in the benchmark data I keep - but in
just looking at it, I would say yes.

>>> Also what about min/max std.dev?
>>
>> Those varied a lot from one run to the next anyway - just because I've
>> always been dealing with very short durations - but I did notice that the
>> 'max' seemed to consistently read higher than my prior benchmarks - but
>> only by 1-2 seconds (1000-2000ms). Given the variables in the system -
>> that doesn't seem too outrageous to me.
>>
>>> The http sampler code was re-organised for 2.5; additional classes were
>>> added and there is another code layer, but I'd be surprised if that had
>>> a significant effect.
>>
>> I wonder if that mattered - it sure appears to be making a big difference.
>> I think I could re-work the script to use AJP (we're hitting 'tomcat'
>> through an apache front-end right now - and our apache uses AJP), but I
>> have no prior benchmarks to compare AJP with...
>
> Not sure how that would help.

If it were something in the HTTP Request HTTP Client sampler, it might make
a difference if I switched to a different sampler. If it is a problem that
is global to all JMeter samplers, then it won't help at all. Just trying to
isolate the problem area.

> I find it hard to believe that the additional code used by the samplers -
> I think it's just one extra level of indirection - could cause the
> significantly different results you are seeing.
>
> Does the jmeter.log file show anything relevant?

Not that I can see. Everything appears normal to me.

--
Robin D. Wilson
Sr. Director of Web Development
KingsIsle Entertainment, Inc.
VOICE: 512-777-1861
www.KingsIsle.com
Re: JMeter 2.5 performance
On 30 August 2011 21:03, Robin D. Wilson wrote:
>
> Sebb asked:
>> How do the average elapsed times compare?
>
> The average times are about 50% longer than with 2.4 - so where I was
> averaging ~1000ms for the 'submit' (POST) action, it is now showing about
> 1500 to 1750ms. (I am mostly concerned with the 'POST' because that's
> where all the real work occurs on our server side.)

Are all the methods equally affected?

>> Also what about min/max std.dev?
>
> Those varied a lot from one run to the next anyway - just because I've
> always been dealing with very short durations - but I did notice that the
> 'max' seemed to consistently read higher than my prior benchmarks - but
> only by 1-2 seconds (1000-2000ms). Given the variables in the system -
> that doesn't seem too outrageous to me.
>
>> The http sampler code was re-organised for 2.5; additional classes were
>> added and there is another code layer, but I'd be surprised if that had
>> a significant effect.
>
> I wonder if that mattered - it sure appears to be making a big difference.
> I think I could re-work the script to use AJP (we're hitting 'tomcat'
> through an apache front-end right now - and our apache uses AJP), but I
> have no prior benchmarks to compare AJP with...

Not sure how that would help.

I find it hard to believe that the additional code used by the samplers - I
think it's just one extra level of indirection - could cause the
significantly different results you are seeing.

Does the jmeter.log file show anything relevant?
RE: JMeter 2.5 performance
Sebb asked:
> How do the average elapsed times compare?

The average times are about 50% longer than with 2.4 - so where I was
averaging ~1000ms for the 'submit' (POST) action, it is now showing about
1500 to 1750ms. (I am mostly concerned with the 'POST' because that's where
all the real work occurs on our server side.)

> Also what about min/max std.dev?

Those varied a lot from one run to the next anyway - just because I've
always been dealing with very short durations - but I did notice that the
'max' seemed to consistently read higher than my prior benchmarks - but only
by 1-2 seconds (1000-2000ms). Given the variables in the system - that
doesn't seem too outrageous to me.

> The http sampler code was re-organised for 2.5; additional classes were
> added and there is another code layer, but I'd be surprised if that had a
> significant effect.

I wonder if that mattered - it sure appears to be making a big difference. I
think I could re-work the script to use AJP (we're hitting 'tomcat' through
an apache front-end right now - and our apache uses AJP), but I have no
prior benchmarks to compare AJP with...

--
Robin D. Wilson
Sr. Director of Web Development
KingsIsle Entertainment, Inc.
www.KingsIsle.com
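If it helps to compare the 2.4 and 2.5 runs on more than the listener's eyeballed figures, the same average/min/max/std.dev numbers can be recomputed from the saved results files. Below is a rough sketch, assuming a CSV-format JTL whose header row includes the standard 'label' and 'elapsed' columns and whose fields contain no quoted commas; the file path is whatever your test plan writes. Run it against one JTL from each JMeter version and diff the two outputs.

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class JtlStats {
    public static void main(String[] args) throws Exception {
        // args[0] = path to a CSV-format JTL, e.g. results.jtl (example name)
        Map<String, List<Long>> byLabel = new TreeMap<String, List<Long>>();
        BufferedReader in = new BufferedReader(new FileReader(args[0]));
        List<String> header = Arrays.asList(in.readLine().split(","));
        int labelCol = header.indexOf("label");
        int elapsedCol = header.indexOf("elapsed");
        String line;
        while ((line = in.readLine()) != null) {
            String[] f = line.split(","); // assumes no commas inside fields
            List<Long> list = byLabel.get(f[labelCol]);
            if (list == null) {
                list = new ArrayList<Long>();
                byLabel.put(f[labelCol], list);
            }
            list.add(Long.parseLong(f[elapsedCol]));
        }
        in.close();

        for (Map.Entry<String, List<Long>> e : byLabel.entrySet()) {
            List<Long> v = e.getValue();
            long min = Long.MAX_VALUE, max = 0;
            double sum = 0;
            for (long t : v) { sum += t; min = Math.min(min, t); max = Math.max(max, t); }
            double avg = sum / v.size();
            double var = 0;
            for (long t : v) { var += (t - avg) * (t - avg); }
            double std = Math.sqrt(var / v.size()); // population std.dev
            System.out.printf("%-40s n=%d avg=%.0f min=%d max=%d std=%.1f%n",
                    e.getKey(), v.size(), avg, min, max, std);
        }
    }
}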
Re: JMeter 2.5 performance
On 30 August 2011 20:19, Robin D. Wilson wrote:
> I have a JMeter script that performs registrations on our website. On
> JMeter 2.4, I am able to run this script with a throughput of ~160 requests
> per second. With no other differences in the script, I run it on JMeter 2.5
> and only get about 78 requests per second (peak).
>
> I checked and modified configs for all the new "HTTP/4" vs "HTTP/3.1" vs
> "Java" options for the HTTP Request HTTP Client sampler; none of that made
> any difference.
>
> Is JMeter 2.5 known for being slower? Or is there something I should be
> examining in my configuration to get it to speed up?

The HttpClient4 code in 2.5 has a bug that means it does not re-use
connections; this causes it to quickly run out. However, that should not
cause a slowdown.

> The script consists of a 'master' script that includes a 'Registration'
> sampler script. The 'master' script just allows me to set up the test
> environment for my run, and having the 'registration' part included lets
> me use that registration script in many different test plans.
>
> The registration script performs the following actions:
>
> HTTP Request HTTP Client - GET the homepage
> HTTP Request HTTP Client - GET the registration form page
> HTTP Request HTTP Client - GET an AJAX request for our 'username suggestor'
> HTTP Request HTTP Client - POST the registration form
> HTTP Request HTTP Client - GET the homepage (as a logged-in user - based on
> the JSESSIONID cookie)
>
> I generally run it with 100 threads, and have had very consistent results
> when running it under JMeter 2.4. So this is really more of a question
> about JMeter 2.5, and whether it added 50% more overhead to the system
> somewhere, or if I am likely to have a configuration problem that is
> throwing in extra delays.

How do the average elapsed times compare?

Also what about min/max std.dev?

The http sampler code was re-organised for 2.5; additional classes were
added and there is another code layer, but I'd be surprised if that had a
significant effect.

> --
> Robin D. Wilson
> Sr. Director of Web Development
> KingsIsle Entertainment, Inc.
> www.KingsIsle.com
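As a rough back-of-envelope way to relate the two sets of numbers in this thread (throughput dropping from ~160 to ~78 req/s, and the POST average rising from ~1000ms to 1500-1750ms): for a closed test with N always-busy threads and no think time, throughput is roughly N divided by the average time per request. The sketch below just applies that to the figures quoted here; it is an illustration, not a measurement, and it ignores ramp-up, think time, and the mix of fast GETs versus the slower POST.

public class ThroughputCheck {
    // Closed-system approximation (Little's law with no think time):
    //   average time per request (ms) = threads / throughput (req/s) * 1000
    static double impliedAvgMillis(int threads, double requestsPerSecond) {
        return threads / requestsPerSecond * 1000.0;
    }

    public static void main(String[] args) {
        int threads = 100; // thread count reported in the thread
        // Throughputs reported in the thread: ~160 req/s on 2.4, ~78 req/s peak on 2.5.
        System.out.printf("JMeter 2.4: 160 req/s implies ~%.0f ms per request%n",
                impliedAvgMillis(threads, 160.0));
        System.out.printf("JMeter 2.5:  78 req/s implies ~%.0f ms per request%n",
                impliedAvgMillis(threads, 78.0));
        // Prints ~625 ms vs ~1282 ms averaged over all five samplers - the
        // same direction as the reported POST slowdown, though not the same
        // ratio, since the faster GETs are averaged in.
    }
}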