Hi all:
First, thanks for the multiple suggestions. I'm pretty new to PHP
programming, so every suggestion is a great learning opportunity.
Now, some observations: I've tried placing the "ignore_user_abort(TRUE);"
in the code. It seems to have made little, if any, impact -- the page still
appears to time out. I've also tried placing "set_time_limit(0);" both before
and after the "ignore_user_abort(TRUE);" call. Still no improvement.
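In case it helps anyone reproduce what I tried: here's roughly where the calls sit (a minimal sketch, not my full script). One thing I've since read in the manual is that these only affect PHP itself -- the web server can still kill the request on its own schedule:

```php
<?php
// These limits only help if they run before any long work starts,
// so they are the very first lines the script executes.
ignore_user_abort(true);   // keep running even if the browser disconnects
set_time_limit(0);         // lift PHP's max_execution_time for this request

// Caveats: set_time_limit() is disabled when PHP runs in safe mode, and it
// does not override web-server-level timeouts (e.g. Apache's Timeout, or a
// proxy/FastCGI timeout) -- the server can still cut the request off.
```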
I'm now wondering if some error is occurring that, for some reason, is
silently ending the routine. I'm building what may be a very long SQL INSERT
statement for each line in the CSV file that I'm reading; could I be hitting
some upper limit for the length of the SQL code? I'd think that an error would
be presented in this case, but maybe I have to do something explicitly to force
all errors to display? Even warnings?
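If forcing errors on is the right move, something like this at the top of the script is what I'm picturing (a sketch -- the log path is hypothetical, and I'm assuming the mysql_* API here since that's what I'm using; the mysqli equivalents would be mysqli_query()/mysqli_error()):

```php
<?php
// Force every error and warning to the screen while debugging.
error_reporting(E_ALL);                        // report everything, including notices
ini_set('display_errors', '1');                // print errors instead of hiding them
ini_set('log_errors', '1');                    // also keep a copy in a log file
ini_set('error_log', '/tmp/csv_import.log');   // hypothetical log path

// Database calls fail silently unless you check the return value:
$result = mysql_query($sql);
if ($result === false) {
    die('Query failed at this record: ' . mysql_error());
}
```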
Another thing I've noticed is that the "timeout" (I'm not even certain the
problem IS a timeout any longer, hence the quotation marks) doesn't happen at
the same record every time. That's why I thought it was a timeout problem at
first, and assumed that the varying load on the server would account for the
different record numbers processed. If I were hitting some problem with the
SQL statement, I'd expect it to stop at the same record every time. Or is that
misguided thinking, too?
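On the SQL-length theory, one way I could test it is to log the size of each generated INSERT and see whether the failures correlate with unusually long statements. (MySQL, for example, rejects any query larger than its max_allowed_packet setting.) A sketch, with a hypothetical threshold and a hypothetical $recordNum counter from my CSV loop:

```php
<?php
// Log suspiciously long INSERT statements so failures can be correlated
// with statement size. $sql and $recordNum come from the CSV-reading loop.
$len = strlen($sql);
if ($len > 500000) {   // hypothetical threshold; tune to the data
    error_log("Record $recordNum: INSERT is $len bytes long");
}

// Alternatively: issue one INSERT per CSV row (or small batches) instead of
// one huge statement, so no single query can approach the packet limit.
```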
More info: I've activated a session (have to be logged into the application
to update the data), so is it possible that something with the session module
could be causing the problem? I have adjusted the session.gc_maxlifetime value
(per an example I saw in the PHP manual comments elsewhere), but is there some
other value I should adjust, too?
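For the session angle, the two things I've gathered so far are worth writing down (a sketch -- the 24-hour lifetime is just an example value, not a recommendation):

```php
<?php
// 1. Lifetime: session.gc_maxlifetime must be set before session_start()
//    for it to have any effect on this request's session.
ini_set('session.gc_maxlifetime', 86400);  // hypothetical: 24 hours, in seconds
session_start();

// 2. Locking: PHP holds an exclusive lock on the session file for the whole
//    request. If the long import never writes session data, releasing the
//    lock early lets other requests from the same logged-in user proceed:
session_write_close();
```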
Thanks for all of your continued help! And please don't argue over the
"best" way to solve my problem -- ALL suggestions are welcome!
(Even if I don't know thing one about CLI or even how to access it. <g>)
Jon