Tony Esposito wrote:

Please try to fix your email application (Yahoo) to wrap lines at ~76
chars per line ;)

> I am trying to retrieve all columns and some (not all) rows from an Oracle
> table which contains - among other things - 2 CLOB columns, and print this
> to a flat file (ASCII file).
> 
> As I "tune" $dbh->{LongReadLen} = 20000 I get varying results, ranging from
> "Out of Memory!" errors, to not all of the data being retrieved, to a Perl
> Interpreter error window asking whether I want to send the error contents
> to Microsoft.

It seems you are setting a connection-wide attribute for something that
you should limit to a smaller scope.
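Something like this (a minimal sketch -- it assumes $dbh is your
DBI->connect handle, and the table/column names are made up, so
substitute your own).  local() confines the larger LongReadLen to one
block instead of the whole connection:

{
    # only this block sees the bigger buffer; the previous value is
    # restored automatically when the block exits
    local $dbh->{LongReadLen} = 64 * 1024;  # max bytes fetched per CLOB
    local $dbh->{LongTruncOk} = 0;          # fail rather than silently truncate

    my $sth = $dbh->prepare(
        'SELECT clob_col FROM my_table WHERE id = ?'
    ) or die $dbh->errstr;
    $sth->execute(42) or die $sth->errstr;

    my ($clob) = $sth->fetchrow_array;
    # ... write $clob to your flat file here ...
}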

Again, without knowing what your statement is, it's hard to say, but if
I were running into memory issues, I'd rebuild my SQL query along these
lines:

my $sth = $dbh->prepare("
            SELECT *
            FROM table_name
            WHERE ROWNUM <= 5000
        ") or die $dbh->errstr;
# (Oracle has no LIMIT clause; the ROWNUM filter caps the row count instead)

imho, global catch-alls like that should be a *very last* resort for
trapping problems that would otherwise cause fatal OS errors.

By capping the number of rows each query returns, you can get one
manageable chunk, write it to the file, free that memory, remember the
row id you left off at, and then start over by grabbing the next chunk
and appending it to the file.
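Roughly like this -- a sketch only, and it assumes DBD::Oracle, a
numeric primary-key column called id, and made-up table/column/file
names, so adjust everything to match your schema:

use DBI;

my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 0 });
$dbh->{LongReadLen} = 1024 * 1024;   # room for your biggest CLOB
$dbh->{LongTruncOk} = 0;

# Oracle has no LIMIT clause; the inner ORDER BY plus the outer ROWNUM
# filter hands back the next chunk of at most 5000 rows in key order.
my $sth = $dbh->prepare(q{
    SELECT * FROM (
        SELECT id, clob_one, clob_two
        FROM   my_table
        WHERE  id > ?
        ORDER  BY id
    ) WHERE ROWNUM <= 5000
});

open my $out, '>', 'dump.txt' or die "dump.txt: $!";

my $last_id = 0;
while (1) {
    $sth->execute($last_id);

    my $rows = 0;
    while (my @row = $sth->fetchrow_array) {
        print {$out} join("\t", map { defined $_ ? $_ : '' } @row), "\n";
        $last_id = $row[0];          # remember where we left off
        $rows++;
    }
    last if $rows == 0;              # nothing came back -- we're done
}

close $out or die "dump.txt: $!";
$dbh->disconnect;

Chasing the id forward like that (instead of skipping a growing offset)
keeps every query about the same cost, no matter how deep into the
table you are.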

If you need further help with this, at least reply with the name of
your primary-key (sequence) column and your full SQL statement.

I'll help you write a function to 'chunk' things up.

There are a few possible outcomes to that:

- I respond with code for you and it works (or it doesn't)

- someone else responds with code for you that works

and either way,

- the code you receive is reviewed and criticized heavily, and what
falls out the bottom of the thread is much better and more intriguing
than you could have imagined ;)

Steve

