Remote testing

2013-07-01 Thread Kostadin Georgiev
Hello Jmeter team,

First I want to say: good work on JMeter!
My idea is to test my server by simulating real users generating traffic to
the server. For that, I have installed JMeter servers on different machines
which are not in one subnet; they are located all over the world. That is
my idea for simulating real users generating traffic to my server, and I
couldn't do it. The problem I ran into is that JMeter doesn't work if the
client and servers are located on different networks. Is there any way to
make JMeter work in this scenario, or does JMeter only work if the client
and servers are located in one subnet?

Greetings,
Kostadin


Re: Read CSV parameter in control loop instead of threadGroup

2013-07-01 Thread ankush upadhyay
After adding a User Parameters element inside the while loop for reading
the CSV, it is working fine for me.


On Sat, Jun 29, 2013 at 8:35 PM, Deepak Shetty shet...@gmail.com wrote:

 then likely you haven't configured your CSV data set correctly - did you
 check jmeter.log? (Also I'm assuming you used the correct syntax,
 ${__javaScript("${itemId}"!="<EOF>")} - as well, itemId is case sensitive;
 it must be defined the same way in the CSV Data Set Config.)

 On Sat, Jun 29, 2013 at 7:51 AM, ankush upadhyay ankush.upadh...@gmail.com wrote:

  Yes, I have moved it to under the while controller.
  Also added the condition: "${itemId}" != "<EOF>"

  On Sat, Jun 29, 2013 at 8:02 PM, Deepak Shetty shet...@gmail.com wrote:

   Hi
   are you sure the CSV Data Set Config is a child of the while controller?

   regards
   deepak

   On Sat, Jun 29, 2013 at 7:28 AM, ankush upadhyay ankush.upadh...@gmail.com wrote:

    Thanks for the quick reply Deepak,

    I used the While Controller but unfortunately it still reads the first
    value each time, i.e. it does not iterate over the other values.

    On Sat, Jun 29, 2013 at 6:08 PM, Deepak Shetty shet...@gmail.com wrote:

     WhileController ("${someVariableFromCSV}" != "<EOF>")
     +HTTP Sampler
     ...
     +CSV DataSetConfig (recycle on eof false)

     On Fri, Jun 28, 2013 at 11:14 PM, ankush upadhyay ankush.upadh...@gmail.com wrote:

      Hello all,

      I am a newbie to JMeter and am creating a new test case. The scenario
      is sending an HTTP request to add some items to a cart. Here I want
      to read the items from a CSV file so that a user can easily change or
      modify the items in the file instead of in the JMeter script. It
      works at the ThreadGroup level but not inside the loop.

      --
      Regards
      @Ankush Upadhyay@

-- 
--
Regards
@Ankush Upadhyay@


Re: Read CSV parameter in control loop instead of threadGroup

2013-07-01 Thread ankush upadhyay
Also, I added the CSV Data Set Config outside the while loop for the CSV
parameter name.

Thanks Deepak for your great help.


On Mon, Jul 1, 2013 at 1:12 PM, ankush upadhyay ankush.upadh...@gmail.com wrote:

 After adding a User Parameters element inside the while loop for reading
 the CSV, it is working fine for me.





-- 
--
Regards
@Ankush Upadhyay@


Re: Remote testing

2013-07-01 Thread Deepak Goel
Hey

What error is JMeter giving when it is run across a different network or
subnet? Usually it is the router's job to create a seamless experience,
where the user does not need to know how many subnets he has to work
across.

:)
Deepak




-- 
Namaskara~Nalama~Guten Tag~Bonjour


   --
Keigu

Deepak
7350012833
deic...@gmail.com
http://www.simtree.net

Skype: thumsupdeicool
Google talk: deicool
Blog: http://loveandfearless.wordpress.com
Facebook: http://www.facebook.com/deicool

Contribute to the world, environment and more : http://www.gridrepublic.org





Re: Response Time

2013-07-01 Thread Asheesh
Thanks for the clarification. However, in the case of automated web
services/SOAP testing, there is no induced gap like a user thinking. In
that scenario there should not be any think time.

Is this assumption correct?

Regards
Asheesh


On Mon, Jul 1, 2013 at 1:21 PM, Deepak Goel deic...@gmail.com wrote:

 Hey

 'Elapsed Time' should be equal to 'Response Time' plus 'Think Time'
 (the user thinks before he clicks the next action). Please subtract 'Think
 Time' from 'Elapsed Time' and you should have what you are looking for.

 :)
 Deepak

 On 7/1/13, Asheesh asheesh.mat...@gmail.com wrote:
  Hi,
  We want to capture Response Time and Throughput for web services. So far
  we have been using the "Aggregate Report" for the same.
  However, I came to know that the time mentioned in the report (Max, Min,
  90% Percentile) is elapsed time and not response time.

  Elapsed time is not the same as response time.

  Please advise and guide.

  --
  Regards
  Asheesh






-- 
Regards
Asheesh


Re: Remote testing

2013-07-01 Thread Kostadin Georgiev
There is no error. The problem is that JMeter works by creating two
sockets: one for receiving data and one for sending data. When the client
and server are located on different networks - in my case the server has a
public IP, and the client is NATed behind a router with a different public
IP - I configured the server to receive the data from the client, but it
can't send it back to him. The server tries to send data to the local IP of
the client, which is impossible. Is there any way to make JMeter work in
this scenario? Yes, it works perfectly when both client and server are in
one subnet, but that is not remote testing; you can't simulate real traffic
like this, because all servers are NATed behind one public IP, so that is 1
user for the test :)
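
(This matches a known limitation of JMeter's RMI-based remote testing: the
remote engine sends results back over a callback connection, and it
connects to whatever address the client's JVM advertises - behind NAT, a
private IP. A possible workaround - a sketch only, assuming a reasonably
recent JMeter 2.x and that you can forward the relevant TCP ports on both
routers; the property names below are the ones documented in
jmeter.properties and the Remote Testing manual:

  # on the client (the machine driving the test): advertise the public IP
  # and pin the callback port so it can be port-forwarded
  jmeter -Djava.rmi.server.hostname=<client public IP> -Jclient.rmi.localport=25000

  # on each remote engine: advertise that engine's public IP
  # (or set RMI_HOST_DEF in the jmeter-server script)
  jmeter-server -Djava.rmi.server.hostname=<engine public IP>

Then forward TCP 1099 (and server.rmi.localport, if set) to each engine,
and TCP 25000 to the client. Whether this survives your particular NAT
setup needs testing; many people instead run each engine standalone in
non-GUI mode and merge the result files afterwards.)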


Re: Response Time

2013-07-01 Thread Deepak Goel
Hey

When you record your script, JMeter or any other tool captures 'Think
Time' as you record the script that is to be run. In case 'Think Time' = 0,
elapsed time is response time.

:)
Deepak




-- 
Namaskara~Nalama~Guten Tag~Bonjour


   --
Keigu

Deepak
7350012833
deic...@gmail.com
http://www.simtree.net

Skype: thumsupdeicool
Google talk: deicool
Blog: http://loveandfearless.wordpress.com
Facebook: http://www.facebook.com/deicool

Contribute to the world, environment and more : http://www.gridrepublic.org





Re: Response Time

2013-07-01 Thread sebb
On 1 July 2013 07:51, Asheesh asheesh.mat...@gmail.com wrote:
 Hi,
 We want to capture Response Time and Throughput for web services. So far
 we have been using the "Aggregate Report" for the same.
 However, I came to know that the time mentioned in the report (Max, Min,
 90% Percentile) is elapsed time and not response time.

Where was this information stated?

 Elapsed time is not the same as response time.

http://jmeter.apache.org/usermanual/glossary.html

 Please advise and guide.

 --
 Regards
 Asheesh




Re: Response Time

2013-07-01 Thread Asheesh
Yes, I got these details from the Glossary link you mentioned.
However, the glossary and other JMeter documentation are silent about
capturing Response Time.
We have to find the response time instead.
Wondering whether, in our scenario, where there is access to a web service
and no user intervention is involved, we can assume Elapsed Time to be
Response Time.

Regards
Asheesh





-- 
Regards
Asheesh


Re: JMeter - NTLM authentication

2013-07-01 Thread mrgilbe1
Hi Vikrams9,

I could only get NTLM auth to work by recording using the Java
implementation of the HTTP sampler, and playing back with the HttpClient
3.1 implementation.

Try using Fiddler to trap the traffic back and forth, both when you connect
for real and when you use JMeter to connect. That might give you an idea of
exactly what is going wrong.
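
(A sketch of what the playback side can look like - field and dropdown
names as in the JMeter 2.x GUI; the URL, user, password and domain below
are made-up placeholders:

  HTTP Request sampler
    Implementation: HttpClient3.1
  HTTP Authorization Manager
    Base URL: http://intranet.example.com/
    Username: testuser
    Password: secret
    Domain:   MYDOMAIN    <- the Domain field is what NTLM needs

The Authorization Manager entry, rather than hard-coded headers, is what
lets the sampler answer the NTLM challenge.)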







Re: Response Time

2013-07-01 Thread sebb
On 1 July 2013 11:06, Asheesh asheesh.mat...@gmail.com wrote:
 Interesting question!
 At several instances the JMeter documentation explicitly talks about
 response time; there's a Response Time Graph as well.
 However, in the glossary and several other charts the term elapsed time is
 mentioned.
 For example, in the Aggregate Report that I am capturing there's no
 explicit mention of whether the time displayed is Elapsed or Response
 Time.


So: what metric are you actually looking for?






Re: Read CSV parameter in control loop instead of threadGroup

2013-07-01 Thread ankush upadhyay
The ${__javaScript("${itemId}"!="<EOF>")} condition is reaching an infinite loop.


-- 
--
Regards
@Ankush Upadhyay@


Re: Response Time

2013-07-01 Thread sebb
On 1 July 2013 11:23, Asheesh asheesh.mat...@gmail.com wrote:
 We are looking to capture Response Time and Throughput.

Yes, you already wrote that.

But what do you actually mean by Response Time?
Is it any different from Elapsed Time as documented in the Glossary?
If so, how does it differ?





Re: Response Time

2013-07-01 Thread Asheesh
Considering the different terms used in the JMeter documentation, I am not
sure whether the Glossary's definition of Elapsed Time is the same as
Response Time for SOAP/web services.
My understanding of Response Time is the time lag between issuing a request
and receiving the complete response.
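
(For what it's worth, the glossary sebb linked defines Elapsed Time in
essentially those terms - paraphrasing from
http://jmeter.apache.org/usermanual/glossary.html:

  Elapsed time: measured from just before sending the request to just
                after the last response has been received.
  Latency:      measured from just before sending the request to just
                after the first part of the response has been received.

So for a scripted web-service call with no timers/think time in the plan,
the Elapsed Time JMeter reports should be the response time being asked
about.)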






-- 
Regards
Asheesh


Re: Read CSV parameter in control loop instead of threadGroup

2013-07-01 Thread Deepak Shetty
you either have a syntax error (check jmeter.log) or your CSV Data Set
Config is wrong (you have set it to recycle at EOF).





Re: Read CSV parameter in control loop instead of threadGroup

2013-07-01 Thread ankush upadhyay
Here is my configuration:

While Controller:
  -- CSV Data Set Config
     File name: Item.csv (this is in the same folder where my script resides)
     Variable names: ${__javaScript("${itemId}"!="<EOF>")}
     Recycle on EOF: true
     Stop thread on EOF: true
  -- HTTP Request sampler

I need a new item value from the CSV each time, and the loop to end after
the last item.






-- 
--
Regards
@Ankush Upadhyay@


Re: Read CSV parameter in control loop instead of threadGroup

2013-07-01 Thread Deepak Shetty
If you have Recycle on EOF = true, then yes, the variable never gets the
<EOF> value, so you cannot use this condition to terminate. If you want the
loop to stop once the CSV is fully processed, then you either change that
setting or change the way you determine when to stop.
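
(Pulling the thread together, a working sketch of what Deepak describes -
element names as in JMeter 2.x, and assuming the CSV column is called
itemId:

  While Controller - condition: ${__javaScript("${itemId}" != "<EOF>")}
    + CSV Data Set Config
        Filename: Item.csv
        Variable Names: itemId
        Recycle on EOF: False
        Stop thread on EOF: False
    + HTTP Request sampler (uses ${itemId})

With Recycle on EOF = False, the variable is set to the literal <EOF>
string after the last line, so the While condition turns false and the loop
exits instead of running forever.)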





Establishing baseline metrics

2013-07-01 Thread nmq
Hi all

This is not a JMeter-specific question, but since this user list comprises
experts in performance testing, I figured it would be a good place to ask.

My question is: how do you establish baselines for a website's performance
if you do not have any historic data? Let's say this is a new website and
it's for a limited number of customers.

How do you determine what the number of concurrent users you simulate
should be?

Let's say the executives say, off the top of their heads, that the maximum
number of concurrent users would be 50 at peak times. Does that mean I
should not go beyond 50, or should I still do tests with a higher number?

How can I go about establishing baselines for page load times if I do not
have any historic data and have no industry benchmarks or competitor data?

Would it make sense to say let's see how the website is doing throughout
the development phase and establish our baseline using the current response
times?

I would appreciate any input.


Regards
Sam


Re: Establishing baseline metrics

2013-07-01 Thread Deepak Shetty
"Does that mean I should not go beyond 50 or should I still do tests with
a higher number?"
You usually have to factor in growth (growing at a rate of X users per
month or whatever, up to whenever your next scheduled release is, where you
could make a reasonable amount of optimisations) - i.e. capacity planning.

"How can I go about establishing baselines for page load times, if I do not
have any historic data and have no industry benchmarks or competitor data?"
There are some usability studies on what times cause users to perceive a
site as slow (normal navigation has different values than login or search
or more secure pages like checkout) - you can google these. They are
specific to industry and functionality. (For example, we have 2 seconds for
the search page (the most heavily used part of the website), 4 seconds for
catalog pages, and 6 seconds for login/secure pages/pages that have any ERP
integration.) But it's better if you actually conduct some usability tests
with end users, or if you can find equivalents for your industry.

"Would it make sense to say let's see how the website is doing throughout
the development phase and establish our baseline using the current response
times?"
No! You have to determine the times you want (as above) and see that your
current responses satisfy those. (Suppose your current page takes 60
seconds: are you satisfied that that is your baseline?)





RE: Establishing baseline metrics

2013-07-01 Thread Robin D. Wilson
I'm thinking I look at performance testing differently than a lot of
people... For me, the objective of performance testing is to establish what
your system _can_ do, not what you need to accomplish. So when you are
setting up your tests, you are trying to drive your systems at maximum
capacity for some extended period of time. Then you measure that capacity
as your 'baseline'.

For every subsequent release of your code, you measure it against the
'baseline', and determine whether the code got faster or slower. If you
determine that the slower (or faster) response is acceptable to your end
users (because you were nowhere near the users' acceptable standard), you
can reset your baseline to that standard. If your slower standard is
encroaching on the usability of the system, you can declare that baseline
as the minimum spec, and then fail any code that exceeds that standard.

As for how you determine what is acceptable to a 'user', that can be
handled in a number of ways - without actually improving the 'real'
performance of the system. Consider a web page that loads a bunch of rows
of data in a big table. For most users, if they can start reading the table
within 1-2 seconds, that is acceptable system performance. But if there are
hundreds of rows of data, you would not need to load _all_ the rows within
1-2 seconds to actually meet their performance criteria. You only need to
load enough rows that the table fills the browser - so they can start
reading - within the 1-2 second period. JMeter cannot really measure this
timing; it can only measure the 'overall response time' (indeed, I don't
know of any testing tool that can do it). So trying to define a performance
benchmark in terms of what 'users' experience is really difficult, and
nearly useless (to me anyway).

I look at performance testing as a way to cross-check my development team
against the perpetual tendency to gum up the code and slow things down. So
in order to make the testing effective for the developers, I need to perf
test _very_specific_ things. Trying to performance test the system as a
whole is nearly an impossible task - not only because there are so many
variables that influence the tests, but precisely because all of those
variables make it impossible to debug which one causes the bottleneck when
there is a change in performance from one release to the next. (Have you
ever sent your programmers off to 'fix' a performance problem that turned
out to be caused by an O/S update on your server? I have...)

Instead, we create performance tests that test specific functional systems.
That is, the 'login' perf test, the 'registration' perf test, the '...'
perf test. Each one of these tests is run independently, so that when we
encounter a slower benchmark we can tell the developers immediately where
to concentrate their efforts in fixing the problem. (We also monitor all
parts of the system (CPU, IO, database transactions (reads, writes, full
table scans, etc.)) from all servers involved in the test.) The goal is not
to simulate 'real user activity'; it is to max out the capacity of at least
1 of the servers in the test (specifically the one executing the
'application logic'). If we max out that one server, we know that our
'benchmark' is the most we can expect of a single member of our cluster of
machines. (We also test a cluster of 2 machines, and measure the fall-off
in capacity between a 1-member cluster and a 2-member cluster; this gives
us an idea of how much impact our 'clustering' system has on performance as
well.) I suppose you could say that I look at it as: we measure the
'maximum capacity', and so long as the number of users doesn't exceed that,
we will perform OK.

We do run some 'all-encompassing' system tests as well, but those are more
for 'stress' testing than for performance benchmarking. We are specifically
looking for things that start to break down after hours of continuous
operation at peak capacity. So we monitor error logs and look to make sure
that we aren't throwing errors while under stress.

The number one thing to keep in mind about performance testing is that you
have to use 'real data'. We actually download our production database every
weekend, and strip out any 'personal information' (stuff that we protect in
our production environment) by either nulling it out or replacing it with
bogus data. This allows us to run our performance tests against a database
that has 100s of millions of rows of data. Nearly all of our performance
'bugs' have been caused by poor data handling in the code (SQL requests
that don't use indices (causing a full table scan), badly formed joins,
fetching a few rows of data and then looping through them in the code (when
the 'few rows of data' from your 'dev' environment become 100,000 rows with
the production data, this tends to bog the code down a lot), etc.). So if
you are testing with 'faked' data, odds are good you will miss a lot of 

Re: Establishing baseline metrics

2013-07-01 Thread Deepak Shetty
Hi
We seem to have differing philosophies :)

"For me, the objective of performance testing is to establish what your
system _can_ do, not what you need to accomplish."
One (but not all) of the goals of performance testing is indeed whether or
not your system does what you set out to accomplish. If it's the first time
you are releasing something (as the poster implies), you will have a set of
goals you need to accomplish, and if your perf tests indicate that those
goals are not met then you do have to make some change (infrastructure or
code or whatever) and rerun your tests till you meet those goals. You also
have to know when to stop (you stop the optimisations when the goals are
met).

"So trying to define a performance benchmark in terms of what 'users'
experience is really difficult, and nearly useless (to me anyway)."
You are missing the point here. If you think that X seconds is an
acceptable experience but a majority of users report your website as slow,
are you going to dismiss their experience as nearly useless? You have to
have some way of validating that your baseline is good enough for your
users.

"Trying to performance test the system as a whole is nearly an impossible
task - not only because there are so many variables that influence the
tests, but precisely because all of those variables make it impossible to
debug which one causes the bottleneck when there is a change in performance
from one release to the next."
You are mixing the detection part (the load test) with the analysis or
profiling part. Once you know there is a problem, you can analyse it in a
way that works for you. But it is absolutely necessary to test the system
as a whole, precisely because of those variables.




pure language user interface

2013-07-01 Thread 黄吉浩
hi, all

I'm a new Chinese user of JMeter. The user interface language of JMeter is
partly English, partly Chinese. It's uncomfortable.

Is there a way to set the user interface language to English only? I think
a fully Chinese interface may be unreachable at the moment...

Re: Establishing baseline metrics

2013-07-01 Thread Deepak Goel
For baseline metrics, do a stress test until the system crashes; then the
'Number of Users' supported can be the baseline metric for you, along with
other data (processor and memory usage).
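
(A minimal JMeter sketch of such a ramp, in the same spirit as the test
plan outlines earlier in this thread - all numbers purely illustrative:

  Test Plan
    Thread Group (Number of Threads: 500, Ramp-Up Period: 1800 s,
                  Loop Count: Forever)
      HTTP Request (your busiest page)
      Summary Report / Aggregate Report listener

Watch for the point at which errors appear or response times collapse; the
concurrent-user count just before that point is the ceiling you baseline
against.)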


Re: pure language user interface

2013-07-01 Thread Vance Zhao
you may need to take a look to the jmeter.properties files to set the Lang 
properly   
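
(Specifically - a sketch, assuming a standard JMeter 2.x install; the
property is documented in jmeter.properties itself:

  # in jmeter.properties, uncomment/set the preferred GUI language:
  language=en

  # or pass it on the command line without editing the file:
  jmeter -Jlanguage=en

There is also Options > Choose Language in the GUI menu for switching at
runtime.)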

---Sent from Boxer | http://getboxer.com
