Well, you usually do get that from the 1000: 100 will do search, another 100 will perform the ordering flow, and the rest will browse the site, or something similar to that.
> It's MUCH easier to try and find the maximum tolerable load measured in (e.g.) page interactions per second.

Yes, but you can't really do much with that number on its own; shared code interacts in complex ways. It's your second step that's the important one, and given the time constraints that all projects suffer, it might not be practical to do what you propose. (I'm not disagreeing with your approach, by the way.)

On Wed, Jan 5, 2011 at 3:26 AM, Felix Frank <[email protected]> wrote:
> On 01/05/2011 12:55 AM, chinni20 wrote:
> >
> > Hi Deepak,
> >
> > I appreciate your patience and thank you for your concerns!
> >
> > My ultimate application goal is that transaction response time should not
> > exceed 2 secs when 1000 users try to access the site simultaneously by
> > logging into the site.
> > Basically it's playbill.
> >
> > Please advise me!
>
> From my (limited) experience, it's excruciatingly difficult to model the
> load induced by "1000 users accessing the site". What does each user do?
> How fast does each user navigate? How many possible interactions are
> there? etc.
>
> It's MUCH easier to try and find the maximum tolerable load measured in
> (e.g.) page interactions per second. Just raise the induced load until
> your transaction time exceeds your desirable limit. Then examine the
> page interactions per second your servers had been logging at that point.
>
> In a second step, try and make an estimate of how many page interactions
> per second a given number of concurrent users inflict. Then you can
> estimate the number of tolerable concurrent users.
>
> HTH,
> Felix
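For what it's worth, the first step Felix describes (raise the induced load until the transaction time exceeds the limit, then note the interactions/sec at that point) can be sketched as a simple ramp loop. This is only an illustration: the rate step, the 2 s limit from the original question, and the `fake_measure` saturation curve standing in for a real measurement (e.g. a JMeter run per rate) are all made-up placeholders.

```python
RESPONSE_LIMIT = 2.0  # the 2 s goal from the original question


def find_max_load(measure_response_time, start_rate=10, step=10, max_rate=1000):
    """Ramp the induced load (interactions/sec) upward until the measured
    response time exceeds the tolerable limit; return the last rate that
    stayed within the limit, or None if even the first step fails."""
    last_ok = None
    rate = start_rate
    while rate <= max_rate:
        if measure_response_time(rate) > RESPONSE_LIMIT:
            break
        last_ok = rate
        rate += step
    return last_ok


# Toy latency model standing in for a real measurement: response time
# rises hyperbolically as the rate approaches the server's saturation
# point (a rough M/M/1-style curve with hypothetical numbers).
def fake_measure(rate, capacity=150.0):
    if rate >= capacity:
        return float("inf")
    return 0.3 / (1.0 - rate / capacity)


print(find_max_load(fake_measure))  # last rate still under the 2 s limit
```

In a real test you'd replace `fake_measure` with something that actually drives the load generator at the given rate and reports the observed transaction time.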
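And the second step (estimating how many interactions/sec a given number of users inflict) is just arithmetic once you have per-user rates, which is why the user mix mentioned at the top of this reply matters. The rates below are invented for illustration; real numbers would come from access-log analysis of each behaviour.

```python
# Hypothetical per-user interaction rates for the 1000-user mix described
# above: 100 searching, 100 ordering, 800 browsing. All rates are guesses.
mix = {
    "search": (100, 0.20),  # (concurrent users, interactions/sec per user)
    "order":  (100, 0.10),
    "browse": (800, 0.05),
}

# Total load this mix would inflict on the servers:
total_rate = sum(users * rate for users, rate in mix.values())  # 70.0/sec

# Invert it: given a measured maximum tolerable load (say 120
# interactions/sec from the ramp test), estimate the tolerable user count.
per_user = total_rate / 1000
max_measured = 120.0  # hypothetical result of step one
tolerable_users = max_measured / per_user

print(total_rate, round(tolerable_users))
```

The point being that the estimate is only as good as the per-user rates you feed it, which is exactly the modelling problem Felix mentions.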

