On 11.02.2012 22:03, Tudor Girba wrote:
Hi,

On 11 Feb 2012, at 21:13, Philippe Marschall wrote:

On 11.02.2012 19:30, Tudor Girba wrote:
Hi,

Sorry for the late reply, and thanks everyone for the suggestions.

I did not provide many details because I am new to this domain and I wanted to 
see from the reactions whether I am missing some relevant direction.

We are trying to measure how an Oracle database can cope with an increase in 
usage (basically, there will be more users for the application).

Just more users, or also more data? Will the users access the database through 
one application, or does each have its own application?

Both, but users are the more pressing problem. We already have enough data to 
expose problems. It's one application: a legacy two-tier Delphi application 
wrapped with a three-tier JEE layer.

That sounds fun. Have you considered load testing through the application instead of against the database directly? If the application doesn't scale, it doesn't help that the database does.

We are basing our analysis on typical SQL statements coming from the 
application. We are currently doing load testing by:
- recording sql statements from some use cases that are considered to be 
important
- generalizing them by replacing actual values with generic variables

That shouldn't be necessary; the queries should already contain bind variables 
(unless your database layer is crap).
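For example (a minimal JDBC sketch, assuming an open Connection conn and 
Statement stmt; the table and column names are made up):

  // literal baked into the SQL text: every distinct value is a
  // different statement for Oracle to parse
  ResultSet literal = stmt.executeQuery(
      "SELECT * FROM orders WHERE customer_id = 42");

  // bind variable: one statement text, reused for every value
  PreparedStatement ps = conn.prepareStatement(
      "SELECT * FROM orders WHERE customer_id = ?");
  ps.setInt(1, 42);
  ResultSet bound = ps.executeQuery();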

Well, indeed, this should not happen, but legacy is never clean :). Anyway, 
there aren't that many cases.


- providing meaningful values for the generic variables
- replaying them against the database from several client machines
- consuming the first record from the responses
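Roughly like this, per statement (a simplified sketch of one replay step; the 
query and names are illustrative):

  // replay one generalized statement with concrete values and time it
  long start = System.nanoTime();
  try (PreparedStatement ps = conn.prepareStatement(
          "SELECT * FROM orders WHERE customer_id = ?")) {
      ps.setInt(1, customerId);
      try (ResultSet rs = ps.executeQuery()) {
          if (rs.next()) {          // consume only the first record
              rs.getObject(1);
          }
      }
  }
  long elapsedNanos = System.nanoTime() - start;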

Why only the first?

Because we are not interested in the response itself, only in checking that 
something gets returned. Is this a problem?

Yes, it is. I guess the application reads all rows, so you should too; otherwise you're measuring something other than what the application does.

Imagine you're running
OPTIMIZER_MODE = ALL_ROWS
and the application streams a result set containing tens of thousands of rows into a file. ALL_ROWS optimizes for total throughput rather than for time to the first row, so just looking at the first row isn't going to tell you very much.
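To measure what the application actually does, drain the whole result set 
(same hypothetical query as in the sketch above):

  try (ResultSet rs = ps.executeQuery()) {
      while (rs.next()) {          // fetch every row, as the application would
          rs.getObject(1);         // touch the data so it is really transferred
      }
  }
  // ps.setFetchSize(n) controls how many rows travel per network
  // round trip, which also influences the timing you record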


- reporting the timing of the statements
- recording the CPU, memory and I/O load of the server

Oracle already provides tools for many of these things.

However, I am interested in pitfalls, and in how people interpret the results, 
given that it is hard to determine what typical usage is in terms of which 
statements to trigger and at what delays.

Get yourself a competent Oracle DBA, and probably a sysadmin and a storage guy 
as well. No, seriously: you wouldn't want to have GemStone benchmarked by 
someone who has never used Smalltalk before, would you?

Thanks, we do have a competent Oracle specialist :). But, this being a tricky 
job, I thought of asking around for other experiences.

Yes it is.

Cheers
Philippe

