I took a simple approach to a problem like this on one project: I wrote a
routine, called for any query, that did a COUNT(*) first and then used
system parameter variables to decide whether to tell the user 'too many
records, change criteria' or to run the actual SQL requested.  I kept a
different limit variable for each table, to stay flexible.  Since one
application had a great many tables, I wish I had put the controlling 'max
count' values into a database table instead of a Perl configuration lib.
If you go a route like this, I'd suggest storing the values so you can
reset and change them easily from an Admin page.  Much easier, I would
think (unless you have to go through several gods to get a table created ;')
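
A rough sketch of that pre-check routine in Perl/DBI -- the table names,
limits, and the `guarded_query` helper are all illustrative, not Clay's
actual code, and it assumes `$table` and `$where` come from your own
trusted code, not raw user input:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical per-table 'max count' limits.  In the setup described
# above these lived in a Perl config lib; a DB table that an Admin page
# can edit is the nicer home for them.
my %MAX_ROWS = (
    orders    => 50_000,
    customers => 10_000,
);

# Run a COUNT(*) with the user's criteria first; only run the real query
# when the result size is under the table's limit.  $dbh is a connected
# DBI handle (e.g. DBD::Oracle).  Returns ($rows, undef) on success or
# (undef, $error_message) when the user should narrow the search.
sub guarded_query {
    my ($dbh, $table, $where, @bind) = @_;
    my $limit = $MAX_ROWS{$table} || 5_000;   # fallback default

    my ($count) = $dbh->selectrow_array(
        "SELECT COUNT(*) FROM $table WHERE $where", undef, @bind);

    if ($count > $limit) {
        return (undef,
            "Too many records ($count > $limit); please narrow your criteria.");
    }

    my $rows = $dbh->selectall_arrayref(
        "SELECT * FROM $table WHERE $where", undef, @bind);
    return ($rows, undef);
}
```

The CGI script checks the second return value and shows the 'change your
criteria' page instead of tying up Oracle with the big query.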

You could also capture the PID and pass it to another Perl child process
that, on a timer, checks on and kills the PID it was given.  I've done this
in non-web Perl, where it worked pretty well, but it can get messy.
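
A minimal sketch of that watchdog pattern, assuming a Unix-style fork/kill
environment (the timeout, signal choice, and `spawn_watchdog` name are all
illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Fork a watchdog that kills $target_pid after $timeout seconds if it is
# still alive.  Returns the watchdog's own PID so the parent can kill the
# watchdog itself when the query finishes normally.
sub spawn_watchdog {
    my ($target_pid, $timeout) = @_;
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    return $pid if $pid;              # parent: hand back the watchdog PID

    # Child (the watchdog): sleep out the timer, then check and kill.
    sleep $timeout;
    if (kill 0, $target_pid) {        # signal 0 = "is it still there?"
        kill 'TERM', $target_pid;     # ask nicely first
        sleep 2;
        kill 'KILL', $target_pid if kill 0, $target_pid;
    }
    exit 0;
}
```

In the CGI script you'd call `spawn_watchdog($$, $max_seconds)` before the
long DBI execute, and kill the watchdog yourself once the query returns.
One caveat: killing the Perl process may not immediately end the
server-side Oracle session, so some cleanup on the Oracle side can still
be needed.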




Clay Stewart - "Software development for 26 years and counting . . ."
Sr. Web Applications Developer

UUNET, a Worldcom Company
-------------------------------------->>>
Ashburn Campus
F1-3-534
703.886.6577
email:  [EMAIL PROTECTED]



-----Original Message-----
From: Bill McClintock [mailto:[EMAIL PROTECTED]]
Sent: Thursday, January 10, 2002 1:32 PM
To: [EMAIL PROTECTED]
Subject: Cancelling HUGE sql queries to ORACLE from PERL abort???


Here's the situation...I have a web-based application on a SUN system w/ 
Apache that allows the user to fill out a search criteria form and 
submit it back to the server to perform SQL queries, via PERL(CGI) to an 
Oracle DB located on the same Sun server.

CLIENT(ANY OS) <----> WEBSERVER/ORACLE DB(SUN)

Upon submission of the form to the webserver, a PERL(CGI) script starts a 
SQL query.  These queries are very large and can take hours to complete, 
so the user, on the CLIENT end (browser), cancels the operation after a 
period of time, refines their search criteria, and tries again.  Imagine 
this happening several times...

In the meantime the previous PERL(CGI) processes are still running, 
presumably blocked in the DBI execute() call, and will run until that 
call completes; the script then terminates if there is no place for the 
results to go.  Some scripts dump to a file, so the script creates the 
file and writes the results even though there is no need for the data.

What I am looking for, if possible, is to cancel the DBI call when the 
user hits cancel on the client machine.

The server is getting bogged down with orphaned PERL(CGI) processes that 
keep running and cause Oracle to work on queries the client abandoned 
long ago.

I have heard talk about using a signal handler/trap to catch abort 
signals, but I was wondering if anyone has experienced this and/or has a 
solution.

NOTE: I have also seen this type of behavior on Win32 platforms running 
ActiveState Perl.

Any advice is greatly appreciated...
-- 
Bill McClintock - EDMS App/Web Development
    MCIWorldcom - Colorado Springs (GOG)
    VNet:622-0054   Local:(719)265-0054
      Pager:1-800-PAGEMCI PIN#1686599
      EMail:[EMAIL PROTECTED]
               AOLIM:bm0054
