I am using cURL to check pages outside my site for certain text, then I email
the person in my database who is supposed to look after that site. The script
spawns a process that never terminates. It runs as a scheduled cron job, so
eventually my host's limit of 20 concurrent processes is exceeded and the
site goes offline.
It may be that some of the pages I check are slow to respond, leaving cURL
waiting on the target URL. I have removed the emailing step entirely and
still get the problem. Is there a way to make sure a process is killed if it
has not terminated after a reasonable time?
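To illustrate the kind of hard limit I am after: cURL itself can bound a transfer (the `--max-time` flag on the command line, or `CURLOPT_TIMEOUT` in libcurl bindings), and the same idea can be enforced from outside by running the fetch as a child process and killing it on a deadline. A minimal Python sketch of that watchdog pattern, assuming the fetch command is passed in as a list (the command shown is a stand-in for the real cURL call):

```python
import subprocess
import sys

def fetch_with_deadline(cmd, seconds):
    """Run a command (e.g. a curl fetch) but kill it after `seconds`.

    subprocess.run() kills the child and waits for it when the timeout
    expires, so no hung process is left behind to accumulate.
    """
    try:
        result = subprocess.run(cmd, capture_output=True, timeout=seconds)
        return result.stdout
    except subprocess.TimeoutExpired:
        # The child has been killed; treat this as "no response".
        return None

# Simulate a page that never responds with a child that sleeps for 60 s;
# instead of hanging forever, it is killed after 2 s.
hung = [sys.executable, "-c", "import time; time.sleep(60)"]
print(fetch_with_deadline(hung, 2))  # prints None after ~2 s
```

With something like this in the cron script, a slow or dead target page costs at most the deadline rather than a permanently stuck process.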