Hi all,

I wrote a PHP script that runs very long queries (hours) against a
PostgreSQL database.
I run into a problem when a single query takes a long time (like 5
hours for an UPDATE query). The database log shows the following:

2005-09-30 17:12:13 IDT postgres : LOG:  00000: duration: 18730038.678
ms  statement: UPDATE product_temp SET nleft=(SELECT
2005-09-30 17:12:13 IDT postgres : LOCATION:  exec_simple_query,
postgres.c:1035
2005-09-30 17:12:13 IDT postgres : LOG:  08006: could not send data to
client: Broken pipe
2005-09-30 17:12:13 IDT postgres : LOCATION:  internal_flush, pqcomm.c:1050
2005-09-30 17:12:13 IDT postgres : LOG:  08P01: unexpected EOF on client
connection
2005-09-30 17:12:13 IDT postgres : LOCATION:  SocketBackend, postgres.c:287
2005-09-30 17:12:13 IDT postgres : LOG:  00000: disconnection: session
time: 6:04:58.52
2005-09-30 17:12:13 IDT postgres : LOCATION:  log_disconnections,
postgres.c:3403

After the 5-hour UPDATE, the script needs to echo a line into a log
file saying that the current command has finished (just so I know the
timings). My first assumption is that PHP reads the code into memory at
startup and opens the handle to the log file; after waiting too long
without any "sign of life" it gives up and closes that handle, so when
the code finally comes back to write to the file there is no longer any
open connection to it (broken pipe).
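
For reference, here is roughly the shape of the script (heavily
simplified; the file path, connection string and query text below are
placeholders, not the real ones):

<?php
// Simplified sketch of the script; paths, connection string and the
// query text are placeholders.
$log = fopen('/var/log/update_script.log', 'a') or die("cannot open log file\n");
$db  = pg_connect('dbname=mydb user=postgres') or die("cannot connect\n");

fwrite($log, date('Y-m-d H:i:s') . " starting UPDATE\n");

// The statement that runs for roughly 5 hours.
$result = pg_query($db, 'UPDATE product_temp SET nleft = (SELECT ...)');

// The line that never makes it into the log when the broken pipe shows up.
fwrite($log, date('Y-m-d H:i:s') . " UPDATE finished\n");

fclose($log);
pg_close($db);
?>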

My other, newer assumption: even though I set max_execution_time to 0
(unlimited), PHP still shuts itself down after waiting too long for the
database to respond, and when the database finally does respond the PHP
client is already closed, so I get the broken pipe error.
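
Just to be precise about what I set: max_execution_time = 0 in php.ini,
which as far as I can tell should be the same as doing this at the top
of the script:

<?php
// As I understand it, both of these mean "no PHP execution time limit",
// i.e. the same as max_execution_time = 0 in php.ini:
ini_set('max_execution_time', '0');
set_time_limit(0);

// Printed at the start of the script just to confirm the value really is 0.
echo 'max_execution_time = ', ini_get('max_execution_time'), "\n";
?>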

Is either of my assumptions correct? If so, how can I tell PHP to wait
as long as I want it to?
Of course, if I'm wrong I would like to know the real reason as well :)


Thanks in advance,
  Ben-Nes Yonatan
