Thanks for the responses.
I would love to be able to output the data to a flat file or something and do
bulk inserts daily; however, this isn't really feasible in my situation because
the database operations depend on other records.
Basically, the system I am creating works like this:
The first bottleneck is the open/close for each record, which is a
time-consuming operation. Why don't you just use one connection?
The second is to do the condition check in the script instead of doing it all
within the database server.
Try to fix these two issues. If it is still too slow, we can do more.
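To make the suggestion concrete, here is a rough sketch of that pattern: one
connection opened at startup, a statement prepared once and reused, and the
duplicate/condition check done in a Perl hash rather than a SELECT per log
line. The DSN, credentials, `connections` table, and log format are all made
up for illustration, not taken from the original script.

#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use File::Tail;

# One connection, opened once at startup and reused for every record.
# DSN, credentials, and table below are hypothetical.
my $dbh = DBI->connect('dbi:mysql:database=logs;host=localhost',
                       'user', 'password',
                       { RaiseError => 1, AutoCommit => 1 });

# Prepare once, execute many times -- avoids re-parsing the SQL per record.
my $insert = $dbh->prepare(
    'INSERT INTO connections (src_ip, dst_ip, opened_at) VALUES (?, ?, ?)');

my $tail = File::Tail->new(name => '/var/log/firewall.log');

my %seen;   # condition check kept in the script, not in a per-line SELECT
while (defined(my $line = $tail->read)) {
    # Hypothetical log format: "... SRC=1.2.3.4 DST=5.6.7.8 ..."
    next unless my ($src, $dst) = $line =~ /SRC=(\S+)\s+DST=(\S+)/;
    next if $seen{"$src|$dst"}++;   # already handled; skip the DB entirely
    $insert->execute($src, $dst, time());
}

Whether a simple in-memory hash is enough depends on how the records relate
to each other, of course; it is just the shape of the idea.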
Tiger,
I didn't mean that I am opening closing the database connection for each
record. I only have 1 database connection which is opened at the beginning of
the perl script and closed when (if ever) the script is killed.
The open and close I was referring to was for the actual log messages.
What do you mean about the condition check in a script? I'm not sure I follow.
On 09/21/2011 10:41 AM, tiger peng wrote:
The first bottleneck is the open/close for each record, which is a
time-consuming operation. Why don't you just use one connection?
The second is to do the condition check in the script
On Wednesday, September 21, 2011 10:04 AM, Brandon Phelps
<bphe...@gls.com> wrote:
Subject: Re: Tail Module + DBI Module, can't keep up!
[...]
4. Perl script from (3) constantly reads in the /var/log files using
the Tail module. It does the following:
a. When a connection is
Here's a way that might speed things up for you considerably by
eliminating a DB hit.
I'm assuming that your query only returns 1 row and that you are not
stepping through them all, just selecting the last one returned. If
you are stepping through them, modify your select to only return 1 row.
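In DBI terms that might look something like the following, assuming an
already-open $dbh handle, MySQL's LIMIT clause, and a hypothetical
`connections` table (the table and column names are invented for the example):

# Ask the server for only the single newest matching row, instead of
# fetching many rows and keeping the last one. Table and column names
# are hypothetical.
my $sth = $dbh->prepare(q{
    SELECT id, opened_at
      FROM connections
     WHERE src_ip = ? AND dst_ip = ?
     ORDER BY opened_at DESC
     LIMIT 1
});
$sth->execute($src_ip, $dst_ip);
my ($id, $opened_at) = $sth->fetchrow_array;   # at most one row comes back
$sth->finish;

With ORDER BY ... LIMIT 1 the server does the "pick the last one" work, so
the script fetches exactly one row per check instead of looping.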
On Sep 21, 2011, at 8:55 AM, Curtis Leach wrote:
[...]