Thanks for everyone's replies - but I'm a little uncertain as to the
reasons for not running the pcntl functions under a web server
environment - is it down to security?

cheers

-----Original Message-----
From: Richard Lynch [mailto:[EMAIL PROTECTED] 
Sent: 18 April 2006 22:52
To: James Nunnerley
Cc: php-general@lists.php.net
Subject: Re: [PHP] Forking a search - pcntl

On Tue, April 18, 2006 5:21 am, James Nunnerley wrote:
> What we'd like to do is run the actual search query in the background
> (i.e.
> fork) while the viewable page shows a nice scrollie banner etc!

fork is not the only solution for this, thank [insert deity here]

> Due to various problems with the server we are using (Zeus) we can't
> just
> run an exec().  I've had a look around, and it would seem that pcntl
> maybe
> the way forward.

NO WAY!

pcntl should NOT be run in a web-hosted environment, according to the
docs.

I don't think Zeus' threading model is going to make it "okay" to use.

> Does anyone have an example working script, or indeed a decent
> tutorial on
> how to use the functionality?  The php manual has limited information
> on
> using the functions, and I just can't get my head around how it's
> meant to
> work?!!!

pcntl should only be used from CLI/CGI.

The examples should be sufficient for that, but you don't really care,
as it's not useful for what you want.

Here is what *I* would recommend:

Create a new table of "searches":
create table search (
  search_id int(11) not null auto_increment primary key,
  status tinyint not null default 0,
  search_pid int(11) default null,
  inputs text,
  results text
);

0 = new search
1 = in progress
2 = complete

Now, when somebody wants you to do a search, just insert a record:
$query = "insert into search (inputs) values ('$CLEAN[inputs]')";
mysql_query($query);
$id = mysql_insert_id();
where $CLEAN is your sanitized, validated $_REQUEST input, of course.

Your scrolling banner page can then have a META Refresh of a few
seconds/minutes/whatever, and embed the $id into the URL for that.
See: http://php.net/mysql_insert_id
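
For what it's worth, the waiting page could be as simple as the sketch
below.  (search_wait.php is a made-up name, and it assumes the usual
mysql_connect()/mysql_select_db() has already happened.)

<?php
// search_wait.php?id=123  (hypothetical page name)
$id = (int) $_GET['id'];

$result = mysql_query("SELECT status, results FROM search WHERE search_id = $id");
$row    = mysql_fetch_assoc($result);

if ($row && $row['status'] == 2) {
    // Search finished -- show the stored results.
    echo $row['results'];
} else {
    // Not done yet -- reload in 5 seconds, show the banner meanwhile.
    echo '<meta http-equiv="refresh" content="5">';
    echo 'Still searching... (nice scrollie banner goes here)';
}
?>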

Then, of course, you need something to actually PERFORM the searches,
which is where a nice cron job comes in.
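
(For reference, the crontab entry could be as simple as the line below;
the path and the search_worker.php script name are placeholders for
whatever you actually call it:)

* * * * * /usr/bin/php /path/to/search_worker.php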

That cron job can start a new search task which will do this:
[in pseudo-code]

$pid = getmypid(); // or something like that
UPDATE search SET status = 1, search_pid = $pid WHERE status = 0 LIMIT 1
SELECT search_id, inputs FROM search WHERE search_pid = $pid AND status = 1
$id = $row['search_id'];
// do the search
// when done:
UPDATE search SET status = 2, results = '$search_results' WHERE search_id = $id

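In case it helps, here is that pseudo-code fleshed out as a rough CLI
script.  The connection details and the run_the_actual_search()
function are placeholders, not real names -- swap in your own.

<?php
// search_worker.php (hypothetical name) -- run this from cron via the
// PHP CLI binary, NOT through the web server.
mysql_connect('localhost', 'dbuser', 'dbpass');   // placeholder credentials
mysql_select_db('mydb');                          // placeholder database

$pid = getmypid();

// Claim exactly one pending search and tie it to this process.
mysql_query("UPDATE search SET status = 1, search_pid = $pid
             WHERE status = 0 LIMIT 1");

if (mysql_affected_rows() > 0) {
    $result = mysql_query("SELECT search_id, inputs FROM search
                           WHERE search_pid = $pid AND status = 1");
    $row = mysql_fetch_assoc($result);
    $id  = (int) $row['search_id'];

    // run_the_actual_search() stands in for your real search code.
    $search_results = run_the_actual_search($row['inputs']);

    $safe = mysql_real_escape_string($search_results);
    mysql_query("UPDATE search SET status = 2, results = '$safe'
                 WHERE search_id = $id");
}
?>
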
Doing it this way means you could even run several processes at once,
each working on a different search.

Note that the UPDATE marks the record as "in progress" and ties it to
the process running, so that there is NO race condition.

If MySQL does not support LIMIT 1 in an UPDATE, which I'm pretty sure
it does, but not 100% certain, then you'd have to just update all the
inputs available, and have the thread handle each search that it
"took" in the UPDATE.

You could still have an army of search processes, though, as new
search inputs are coming in all the time, and each search process
would handle however many were on the "To Do" (status 0) stack when
they started.
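
If you did end up without LIMIT 1 on the UPDATE, the per-process loop
might look roughly like this (same placeholder connection setup and
made-up run_the_actual_search() as above):

<?php
$pid = getmypid();

// Claim EVERY pending search in one atomic UPDATE; rows another process
// already claimed no longer have status = 0, so the two can't collide.
mysql_query("UPDATE search SET status = 1, search_pid = $pid WHERE status = 0");

$result = mysql_query("SELECT search_id, inputs FROM search
                       WHERE search_pid = $pid AND status = 1");

while ($row = mysql_fetch_assoc($result)) {
    $search_results = run_the_actual_search($row['inputs']); // your search code
    $safe = mysql_real_escape_string($search_results);
    mysql_query("UPDATE search SET status = 2, results = '$safe'
                 WHERE search_id = " . (int) $row['search_id']);
}
?>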

This is very scalable, and you could even buy more computers and throw
them at it, with just one database, provided you tagged the process_id
with a machine_id of some kind as well, to avoid duplicate process IDs
on 2 computers.  I'll leave that part as an exercise.

-- 
Like Music?
http://l-i-e.com/artists.htm

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
