php-general Digest 1 Jun 2009 02:54:13 -0000 Issue 6152

Topics (messages 293399 through 293403):

spawning a process that uses pipes - doesn't terminate when webpage download is 
canceled
        293399 by: flint
        293402 by: bruce

Re: mysql_query takes long time...
        293400 by: flint
        293401 by: Phpster

Directing form to different handlers?
        293403 by: Angus Mann

Administrivia:

To subscribe to the digest, e-mail:
        [email protected]

To unsubscribe from the digest, e-mail:
        [email protected]

To post to the list, e-mail:
        [email protected]


----------------------------------------------------------------------
--- Begin Message ---
Sent this before; I don't know if it went through. Someone please reply if
it arrives, even if you don't know the answer.

So here's the scenario:

I have a site that uses PHP with a database to offer sound files to
users via streaming.

The request page has options allowing the user to modify the sound
file in various ways before it is sent to them.

Here's the problem:

The method I'm using to feed the data to the user is to run the source
file through various piped commands, with the resulting audio dumped
to stdout, and then use passthru() in PHP to get that data to the
end user.

Here's an example, serving an MP3 with its pitch/speed changed by sox:

passthru("lame --quiet --decode \"" . $in_file . "\" - | " .
            "sox -V -S -t wav - -t wav - speed " . $speed_factor . " | " .
            "lame --quiet " . $lame_params . " - -");

This works just fine, except that if the end user aborts the transfer
(e.g. stops playback in the media player, cancels the download of the
mp3, whatever), it leaves behind both the sox process and the decoder
LAME process, along with the sh that's running them. The only process
that exits is the final encoding LAME process. If the sound file runs
to completion, everything exits properly.

But this obviously means that enough cancelled downloads leave the
server with a huge batch of stuck processes! I even tried simply
killing the 'host' sh process, but the lame and sox processes remain
anyway. The only way I've been able to deal with this is by killing
the lame and sox processes directly by hand.

Is there any way I can make this work, so that if the user cancels the
transfer, all relevant processes are killed rather than just the
single process that's feeding output into PHP?

-FM


--- End Message ---
--- Begin Message ---
We answered this a number of times...

Was there something in the replies that didn't satisfy you?
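For reference, here's a rough sketch of the process-group approach: start the pipeline in its own process group (via setsid), stream its stdout through PHP, and signal the whole group when the client disconnects. This is only a sketch, not a drop-in fix: it assumes the posix extension is loaded, a setsid(1) binary exists, and that sh exec()s the wrapped command so the pid proc_open() reports is also the group id; verify that on your platform.

```php
<?php
// Hedged sketch: run the whole pipeline (lame | sox | lame) in its own
// process group so every stage dies together when the client aborts.
// Assumptions: posix extension loaded, setsid(1) available, and the pid
// reported by proc_open() being the new group leader's pid.
function stream_pipeline($pipeline)
{
    ignore_user_abort(true);               // we detect the abort ourselves
    $cmd    = 'setsid sh -c ' . escapeshellarg($pipeline);
    $proc   = proc_open($cmd, array(1 => array('pipe', 'w')), $pipes);
    $status = proc_get_status($proc);
    $pgid   = $status['pid'];              // assumed: pid == process group id
    while (!feof($pipes[1])) {
        echo fread($pipes[1], 8192);
        flush();
        if (connection_aborted()) {        // client hung up: kill the group
            posix_kill(-$pgid, 15 /* SIGTERM */);
            break;
        }
    }
    fclose($pipes[1]);
    proc_close($proc);
}
```

You'd call it with the same pipeline string as before; using escapeshellarg() on $in_file when building that string would also close the shell-injection hole in the original example.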



-----Original Message-----
From: flint [mailto:[email protected]]
Sent: Sunday, May 31, 2009 6:53 AM
To: PHP-General List
Subject: [PHP] spawning a process that uses pipes - doesn't terminate
when webpage download is canceled




--- End Message ---
--- Begin Message ---

----- Original Message ----- From: "flint" <[email protected]>
To: "דניאל דנון" <[email protected]>
Sent: Sunday, May 31, 2009 9:21 AM
Subject: Re: [PHP] mysql_query takes long time...


Have you actually tried running the script against your data?

I'm running MySQL/PHP on an old P3/933 MHz box with only 384 MB of
memory, and I had a situation just like yours: a text file of 600,000
lines, where the script reads a line, does regex matches/replacements
on it, and inserts it into SQL. The script took about 15 minutes to
process the entire file and put all the data into MySQL.

And remember, that's on a P3/933, so a faster system will obviously
speed that up significantly.

Don't worry about the time MySQL reports for an individual query; it
depends on many factors. Once you open your SQL connection and start
streaming data in, it will actually go much faster than that. Just run
the script and give it a try. You could even add some debugging
output, something like this:

$counter = 0;
while (!feof($fhandle)) {
    $line = fgets($fhandle);
    process_line($line);
    insert_into_sql($line);
    $counter++;
    if ($counter % 100 == 0) { echo $counter . " records done\n"; }
}

That will give you a printout every 100 records processed, which is a nice way to track progress.

FM

----- Original Message ----- From: "דניאל דנון" <[email protected]>
To: "PHP General List" <[email protected]>
Sent: Sunday, May 31, 2009 7:18 AM
Subject: [PHP] mysql_query takes long time...


I have a file of about 500,000 lines; each line contains a string of
varying length, no less than 3 characters and usually no more than 120,
with an average of about 80 and a maximum of about 250.

I made a PHP script to fetch the data (using fgets), process it and insert
it to a MySQL database.

The problem is that each insert into MySQL takes about 0.02 seconds,
which looks like nothing, but when you have 500,000 lines to insert...
The loop goes like this:

fgets from file
x1 = some function about the string
x2 = some other function about the string
x3 = the string
insert into table (field1, field2, field3) VALUES (x1, x2, x3)

(pseudo-code)

I was wondering, is there any faster way to do this, assuming I have to
do it with PHP? Also, if it matters, the MySQL table has an
auto-increment id column.


Yours, Daniel.

--
Use ROT26 for best security





--- End Message ---
--- Begin Message ---
You can also batch the queries to insert multiple rows in one statement:

INSERT INTO table VALUES (row1col1, row1col2, 'row1col3'), (row2col1, row2col2, 'row2col3'), ..., (rowNcol1, rowNcol2, 'rowNcol3')
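That idea can be wrapped in a small helper that builds one multi-row INSERT per batch. This is just a sketch: the table and column names come from the pseudo-code, and $quote stands in for whatever escaping your driver provides (e.g. a wrapper around mysqli_real_escape_string).

```php
<?php
// Hedged sketch of the multi-row INSERT above: build one statement per
// batch of rows. $quote is a callable doing the driver's escaping;
// all names here are illustrative.
function build_multi_insert($table, array $cols, array $rows, $quote)
{
    $values = array();
    foreach ($rows as $row) {
        $values[] = "('" . implode("','", array_map($quote, $row)) . "')";
    }
    return 'INSERT INTO ' . $table
         . ' (' . implode(', ', $cols) . ') VALUES '
         . implode(', ', $values);
}
```

Flushing a batch every few hundred rows cuts per-statement overhead dramatically while keeping each statement safely under max_allowed_packet.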

Bastien

Sent from my iPod

On May 31, 2009, at 8:18, דניאל דנון <[email protected]> wrote:


--- End Message ---
--- Begin Message ---
Hi all. I realize this is more of an HTML question than a PHP one, but
I'm sure someone here can help.

I have several forms with lots (dozens) of text inputs. If the user
presses the "Update" button I want the form handled by "update.php", but
if they press "Delete" it needs to be handled by "delete.php", or
"add.php", and so on, depending on the button they press.

But when setting up the form I can only specify one handler: <form
method="POST" action="delete.php">, or "add.php", or whatever.

Is there a way to direct the content of the form to a different handler
depending on the button?

I know I can use JavaScript to redirect to a constructed URL and append
?name=smith&address=hishouse&telephone=28376.....and so on, but this is
not practical when there are dozens of entries; the URL becomes massive.
I'd prefer to use "POST" and then use PHP to extract the POST array.

Any ideas?

Much appreciated.
Angus

--- End Message ---
