On 10/07/2010 12:01 AM, Ondrej Ivanič wrote:
> Hi,
> 
> Part of the database benchmark (Postgres 8.4, recent JDBC4 PostgreSQL
> driver) is a query which fetches the 1000 most recent events (row size
> is ~1k, so 1 MB of data in total):
> 
> select * from event order by datetime desc limit 1000
> 
> When the query is executed using the JDBC sampler it takes between 2
> and 5 minutes to finish, compared to less than 50 ms using psql or a
> simple PHP script. I suspect that the value of the JDBC fetch size
> parameter is set too low. Is it possible to change this parameter?
> 
> I found the following workaround:
> explain analyze select * from event order by datetime desc limit 1000
> which executes the query on the server (analyze = carry out the command
> and show the actual run times) and discards the query result. The
> question is whether the test is still reliable...
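For what it's worth, the PostgreSQL JDBC driver only honours setFetchSize() when autocommit is disabled; with autocommit on it materialises the entire result set client-side, which could explain a large slowdown on wide rows. A minimal sketch of cursor-based fetching (the connection URL, credentials, and batch size below are placeholders, not values from the original post):

```java
import java.sql.*;

public class FetchSizeExample {
    // Hypothetical connection details; adjust for your environment.
    static final String URL = "jdbc:postgresql://localhost:5432/benchdb";
    static final String SQL =
        "select * from event order by datetime desc limit 1000";

    public static void main(String[] args) throws SQLException {
        try (Connection conn =
                 DriverManager.getConnection(URL, "user", "pass")) {
            // The PostgreSQL driver streams rows through a cursor only
            // when autocommit is off; otherwise setFetchSize() is
            // ignored and the whole result set is buffered in memory.
            conn.setAutoCommit(false);
            try (Statement st = conn.createStatement()) {
                st.setFetchSize(100); // rows per round trip (assumption)
                try (ResultSet rs = st.executeQuery(SQL)) {
                    while (rs.next()) {
                        // process each row here
                    }
                }
            }
            conn.commit();
        }
    }
}
```

Whether this helps depends on where the time actually goes; it only addresses client-side buffering, not server or network cost.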

Hi,

this question reminds me of the discussions we had here about whether
images and static content should be included in web tests or not.

The answers were "yes", "no", "maybe" and "it depends", in no particular
order or prioritization.

A test that has the server discard all response data will probably model
the CPU load induced by the queries accurately. But if the result sets
are especially large, the network usage normally required for
transferring the results shouldn't be neglected; it may even be the
dominating factor in overall system performance (in "real life", that is).

Bottom line is, you may have to make an educated guess about the
reliability of such a test. Hopefully a database person will speak up,
I'm talking more from a web perspective here.

Cheers,
Felix

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
