On 15/01/2010 11:52, Peter J. Holzer wrote:
[quoting rearranged into sensible order]
On 2010-01-15 17:15:04 +0800, Agarwal, Gopal K wrote:
Apologies for the ambiguous question below; I'll explain it further.
I am connecting to an Oracle DB with Perl DBI. For short queries
(execution time < 2 s) I am able to fetch the data, but for long
queries (> 7 s) my Perl script hangs.
I am writing a Perl program that will analyze database performance
across different vendors. The amount of data is on the order of
gigabytes.
I want to select the data from a table and see the number of rows,
the data volume, and the time taken to fetch the data.
Here is the code snippet for this:
my $sth = $dbh->prepare($query)
    or die "Couldn't prepare statement: " . $dbh->errstr;
$sth->execute()
    or die "Couldn't execute statement: " . $sth->errstr;
my $ref = $sth->fetchall_arrayref;
print "\nNumber of rows: ", 0 + @{$ref};
It is taking too much time to retrieve the data, which is on the order
of 25 million records or more.
There is no timeout in DBI, and, unless you have done something peculiar
to your database, there isn't any in Oracle, either. I often have
queries which are running for several minutes and sometimes even for
several hours.
If you are retrieving 25 million records with fetchall_arrayref, the
problem is much more likely to be memory instead of time. Even if the
rows are very short we are talking about approximately 3 GB of RAM here,
if each row has lots of fields it can be a lot more. Unless you need all
that data in memory at the same time, use fetchrow_arrayref in a loop
instead.
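A minimal sketch of that streaming approach (assuming the same $dbh and
$query as in the original snippet; the byte count is a rough in-memory
estimate of data volume, not the wire size):

```perl
# Sketch only: $dbh and $query are assumed to exist, connected as in
# the original code. Streams rows one at a time instead of holding
# all 25 million in RAM at once.
my $sth = $dbh->prepare($query)
    or die "Couldn't prepare statement: " . $dbh->errstr;
$sth->execute()
    or die "Couldn't execute statement: " . $sth->errstr;

my ($rows, $bytes) = (0, 0);
while (my $row = $sth->fetchrow_arrayref) {
    $rows++;
    $bytes += length($_ // '') for @$row;   # rough per-row data volume
}
print "Rows: $rows, approx. data volume: $bytes bytes\n";
```

Note that fetchrow_arrayref reuses the same array reference on each
call, which is what keeps the memory footprint flat regardless of the
result-set size.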
hp
Alternatively, time it for a few records and then gradually increase
the number of rows requested (perhaps with 'and rownum < /n/'), logging
the elapsed time and row count for each run. That will convince you
it's working OK, and you'll also be able to gauge how long 25 million
rows will take. As Peter says, memory is the more likely culprit, but
the best approach depends on exactly what you are trying to retrieve
and what your architecture looks like.
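One way to sketch that timing loop (Time::HiRes is core Perl; $dbh is
assumed to be connected already, and 'big_table' is a placeholder for
whatever table is actually being measured):

```perl
use Time::HiRes qw(gettimeofday tv_interval);

# Sketch only: $dbh is assumed to exist; 'big_table' is hypothetical.
# Fetches progressively larger slices via Oracle's rownum pseudocolumn
# and logs row count and elapsed time for each run.
for my $limit (1_000, 10_000, 100_000, 1_000_000) {
    my $t0  = [gettimeofday];
    my $sth = $dbh->prepare(
        "SELECT * FROM big_table WHERE rownum <= ?"
    ) or die $dbh->errstr;
    $sth->execute($limit) or die $sth->errstr;

    my $rows = 0;
    $rows++ while $sth->fetchrow_arrayref;

    printf "%9d rows fetched in %.2f s\n", $rows, tv_interval($t0);
}
```

Plotting rows against seconds from that log gives a reasonable
extrapolation to the full 25 million rows, as long as the per-row cost
stays roughly constant.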
--
cheers
Martin