RE: [PHP] spawning a process that uses pipes - doesn't terminate when webpage download is canceled

2009-05-31 Thread bruce
we answered this a number of times... 

was there something in the replies that didn't satisfy you?



-Original Message-
From: flint [mailto:fmill...@gmail.com]
Sent: Sunday, May 31, 2009 6:53 AM
To: PHP-General List
Subject: [PHP] spawning a process that uses pipes - doesn't terminate
when webpage download is canceled


sent this before, don't know if it went through... someone please reply if
it went through, even if you don't know the answer...

so here's the scenario..

I have a site that uses php with a database to offer sound files to
users using streaming methods.

the request page has options, allowing the user to modify the sound
file in various ways, before having it sent to them

Here's the problem:

The method i'm using to feed the data to the user is to run the source
file through various piped commands, with the resulting audio being
dumped to stdout, and then using passthru in php to get that data to
the enduser.

here's an example, for serving an MP3 with its pitch/speed changed by sox:

passthru("lame --quiet --decode \"" . $in_file . "\" - | " .
         "sox -V -S -t wav - -t wav - speed " . $speed_factor . " | " .
         "lame --quiet " . $lame_params . " - -");

This works just fine, except that if the end user aborts the
transfer (e.g. stops playback in the media player, cancels download of
the mp3, whatever) it leaves behind both the sox process and the
decoder LAME process along with the sh that's running them. The only
process that exits is the final encoding lame process. If the sound
file runs to completion, everything exits properly.

But this means that enough cancelled downloads leave the
server with a huge batch of stuck processes! And I even tried
simply killing the 'host' sh process, and the lame and sox processes
remain anyway. The only way I've been able to deal with this is
manually killing the lame and sox processes directly.

is there any way I can make this work, such that if the user
cancels the transfer, all relevant processes are killed rather than
just the single process that's feeding output into php?

-FM


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php





Re: [PHP] spawning a process that uses pipes - doesn't terminate when webpage download is canceled

2009-05-31 Thread Robert Cummings
On Sun, 2009-05-31 at 08:52 -0500, flint wrote:
> sent this before, don't know if it went through... someone please reply if
> it went through, even if you don't know the answer...
>
> so here's the scenario..
>
> I have a site that uses php with a database to offer sound files to
> users using streaming methods.
>
> the request page has options, allowing the user to modify the sound
> file in various ways, before having it sent to them
>
> Here's the problem:
>
> The method i'm using to feed the data to the user is to run the source
> file through various piped commands, with the resulting audio being
> dumped to stdout, and then using passthru in php to get that data to
> the enduser.
>
> here's an example, for serving an MP3 with its pitch/speed changed by sox:
>
> passthru("lame --quiet --decode \"" . $in_file . "\" - | " .
>          "sox -V -S -t wav - -t wav - speed " . $speed_factor . " | " .
>          "lame --quiet " . $lame_params . " - -");
>
> This works just fine, except that if the end user aborts the
> transfer (e.g. stops playback in the media player, cancels download of
> the mp3, whatever) it leaves behind both the sox process and the
> decoder LAME process along with the sh that's running them. The only
> process that exits is the final encoding lame process. If the sound
> file runs to completion, everything exits properly.
>
> But this means that enough cancelled downloads leave the
> server with a huge batch of stuck processes! And I even tried
> simply killing the 'host' sh process, and the lame and sox processes
> remain anyway. The only way I've been able to deal with this is
> manually killing the lame and sox processes directly.
>
> is there any way I can make this work, such that if the user
> cancels the transfer, all relevant processes are killed rather than
> just the single process that's feeding output into php?

Use something else to pass the data back to the user... popen() comes to
mind, or proc_open(). Then disable automatic abort on client disconnect via
ignore_user_abort(). Then, after sending each data chunk, check the
connection status via connection_aborted(). If your script finds
that the user has aborted, kill all the processes in the pipeline
from the PHP script before finally exiting the PHP script itself.
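A minimal sketch of those steps, reusing the pipeline from the original post: proc_open() replaces passthru() so the script keeps control of the stream. The chunk size and the use of pkill(1) to signal the pipeline's members are my assumptions, not from the thread.

```php
<?php
// Sketch: stream the pipeline via proc_open() and clean up on disconnect.
// Assumes $in_file, $speed_factor, $lame_params are set by the request page.

ignore_user_abort(true);   // keep PHP running after the client disconnects
set_time_limit(0);

$cmd = 'lame --quiet --decode ' . escapeshellarg($in_file) . ' - | '
     . 'sox -V -S -t wav - -t wav - speed ' . escapeshellarg($speed_factor) . ' | '
     . 'lame --quiet ' . $lame_params . ' - -';

$descriptors = array(1 => array('pipe', 'w'));   // capture the pipeline's stdout
$proc = proc_open($cmd, $descriptors, $pipes);

if (is_resource($proc)) {
    $status = proc_get_status($proc);
    $shPid  = $status['pid'];          // PID of the sh running the pipeline

    while (!feof($pipes[1])) {
        echo fread($pipes[1], 8192);   // 8 KB chunks: an arbitrary choice
        flush();
        if (connection_aborted()) {
            // lame and sox are direct children of that sh; kill them
            // explicitly, since killing sh alone leaves them behind
            // (as observed in the original post).
            exec('pkill -TERM -P ' . (int)$shPid);
            proc_terminate($proc);
            break;
        }
    }
    fclose($pipes[1]);
    proc_close($proc);
}
```

pkill -P targets the direct children of the sh that proc_open() spawned, which is exactly the set of processes (decoder lame, sox, encoder lame) reported as stuck.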

Cheers,
Rob.
-- 
http://www.interjinn.com
Application and Templating Framework for PHP





[PHP] spawning a process that uses pipes - doesn't terminate when webpage download is canceled

2009-05-28 Thread Flint Million
so here's the scenario..

I have a site that uses php with a database to offer sound files to
users using streaming methods.

the request page has options, allowing the user to modify the sound
file in various ways, before having it sent to them

Here's the problem:

The method i'm using to feed the data to the user is to run the source
file through various piped commands, with the resulting audio being
dumped to stdout, and then using passthru in php to get that data to
the enduser.

here's an example, for serving an MP3 with its pitch/speed changed by sox:

passthru("lame --quiet --decode \"" . $in_file . "\" - | " .
         "sox -V -S -t wav - -t wav - speed " . $speed_factor . " | " .
         "lame --quiet " . $lame_params . " - -");

This works just fine, except that if the end user aborts the
transfer (e.g. stops playback in the media player, cancels download of
the mp3, whatever) it leaves behind both the sox process and the
decoder LAME process along with the sh that's running them. The only
process that exits is the final encoding lame process. If the sound
file runs to completion, everything exits properly.

But this means that enough cancelled downloads leave the
server with a huge batch of stuck processes! And I even tried
simply killing the 'host' sh process, and the lame and sox processes
remain anyway. The only way I've been able to deal with this is
manually killing the lame and sox processes directly.

is there any way I can make this work, such that if the user
cancels the transfer, all relevant processes are killed rather than
just the single process that's feeding output into php?

-FM




RE: [PHP] spawning a process that uses pipes - doesn't terminate when webpage download is canceled

2009-05-28 Thread bruce
Hi Flint.

Not sure if you have a solution to this yet, or if I fully understand! But
if your issue is basically that you have a situation where you might have
orphaned processes that never finish and that are consuming real resources,
you could have the app get/monitor the process ID of each process you
create, and then simply periodically check whether that process is in a
run state.

Now, if the situation is one where the user aborts the transfer, and
the underlying processes are still 'running', then I would still think the
above approach would work, but you'd have to have your app keep track of
when the user 'kills' the download process..

But are you sure the orphaned processes are consuming resources, or are they
zombie processes, which are resident in the process table but aren't really
consuming resources... zombie processes will (should) eventually be dealt
with by the operating system...
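The polling approach described above could be sketched like this. The helper name is hypothetical, it requires PHP's posix extension, and the zombie check reads /proc, so it is Linux-specific:

```php
<?php
// Hypothetical helper for the PID-polling idea above: check whether a
// tracked PID still refers to a live (non-zombie) process.
// Requires the posix extension; the /proc parsing is Linux-specific.

function pid_is_running($pid)
{
    // Signal 0 tests whether the process exists without sending anything.
    if (!posix_kill($pid, 0)) {
        return false;                  // no such process (or no permission)
    }
    // Distinguish a real runner from a zombie via /proc/<pid>/stat, whose
    // third field (after the parenthesized command name) is a state letter.
    $stat = @file_get_contents("/proc/$pid/stat");
    if ($stat === false) {
        return false;
    }
    $fields = explode(' ', substr($stat, strrpos($stat, ')') + 2));
    return $fields[0] !== 'Z';         // 'Z' means zombie
}

// Example: periodically reap anything in the app's bookkeeping that has died.
foreach ($tracked_pids as $pid) {
    if (!pid_is_running($pid)) {
        // remove $pid from the tracking table and clean up its entry
    }
}
```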

regards



-Original Message-
From: Flint Million [mailto:fmill...@gmail.com]
Sent: Wednesday, May 27, 2009 11:39 PM
To: php-general@lists.php.net
Subject: [PHP] spawning a process that uses pipes - doesn't terminate
when webpage download is canceled


so here's the scenario..

I have a site that uses php with a database to offer sound files to
users using streaming methods.

the request page has options, allowing the user to modify the sound
file in various ways, before having it sent to them

Here's the problem:

The method i'm using to feed the data to the user is to run the source
file through various piped commands, with the resulting audio being
dumped to stdout, and then using passthru in php to get that data to
the enduser.

here's an example, for serving an MP3 with its pitch/speed changed by sox:

passthru("lame --quiet --decode \"" . $in_file . "\" - | " .
         "sox -V -S -t wav - -t wav - speed " . $speed_factor . " | " .
         "lame --quiet " . $lame_params . " - -");

This works just fine, except that if the end user aborts the
transfer (e.g. stops playback in the media player, cancels download of
the mp3, whatever) it leaves behind both the sox process and the
decoder LAME process along with the sh that's running them. The only
process that exits is the final encoding lame process. If the sound
file runs to completion, everything exits properly.

But this means that enough cancelled downloads leave the
server with a huge batch of stuck processes! And I even tried
simply killing the 'host' sh process, and the lame and sox processes
remain anyway. The only way I've been able to deal with this is
manually killing the lame and sox processes directly.

is there any way I can make this work, such that if the user
cancels the transfer, all relevant processes are killed rather than
just the single process that's feeding output into php?

-FM
