On Thu, Mar 5, 2009 at 4:14 PM, Chris <dmag...@gmail.com> wrote:
> Firstly, always CC the mailing list so others can add their own suggestions.
> Also, please don't put your reply at the top; it makes it very hard to follow
> what's going on. Put it underneath or inline (put comments after mine and
> add more further down).
> ????? ???? wrote:
>> Several problems....
>> First, I don't have cron jobs either (I'm using OnlineCronJobs.com, which
>> limits the number of cron jobs I can set up). As I said, I am running the
>> script every 8 hours.
> Does your host not support cron jobs? Find another host - or find another
> cron provider that lets you run more frequently. There are others around.
>> If I "delete" the row from the db after *each* execution, then it's 100
>> queries per page, excluding the queries that already exist - a lot of
> No, it's not. A table with 100 rows is nothing, it's tiny. It takes longer
> for you to read this than it does for a db to process 100 rows.
I wrote a scraping program that ran from a shared server at one point.
To get around the execution time limit (since I was at the mercy of
connection speeds to the page being scraped) I had the script process
X records, then forward the user's browser to the same script with
parameters to instruct it to process the next X records.
This was done in PHP browser mode, of course, not from the CLI. I called
the script from a scheduled task I had set up on my desktop PC, which
used cURL to kick the whole thing off.
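The batch-and-resume idea can be sketched in shell terms. This is a rough, hypothetical illustration only: a `records.txt` file stands in for the database table, `BATCH` plays the role of the "process X records" parameter, and the offset loop replaces the browser-forwarding trick.

```shell
#!/bin/sh
# Hypothetical sketch of processing records in batches of X, resuming
# from an offset each pass. records.txt stands in for the real table.
seq 1 100 > records.txt              # fake "table" with 100 rows

BATCH=25
OFFSET=0
TOTAL=$(($(wc -l < records.txt)))    # arithmetic strips wc's padding
: > processed.txt

while [ "$OFFSET" -lt "$TOTAL" ]; do
    # grab the next $BATCH rows starting just past $OFFSET
    tail -n +"$((OFFSET + 1))" records.txt | head -n "$BATCH" |
    while read -r row; do
        # stand-in for the real scraping work on each record
        echo "done: $row" >> processed.txt
    done
    OFFSET=$((OFFSET + BATCH))
done
```

In the real setup, each pass was a fresh HTTP request carrying the offset as a query-string parameter, so no single request ever hit the execution time limit.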
This is far and away one of the more ridiculous loops I've had to make
in order to get around server limitations... but it worked. Anyway,
Chris is right--100 rows of non-derived data is child's play for an
RDBMS to churn through.
If your server supports shell scripting (though I doubt it, if they're
not letting you run cron jobs and they have safe_mode on), you could
probably accomplish all of this with the mysql command-line client.
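Something along these lines, say--a non-runnable sketch only, where the database name, table, columns, and credentials are all invented for illustration (`-N` suppresses column headers and `-B` gives bare tab-separated rows):

```shell
#!/bin/sh
# Hypothetical sketch: pull pending rows with the mysql CLI, scrape each
# URL, then delete the processed row. scrape_db, queue, id, url, and the
# credential variables are all made-up names.
mysql -u "$DB_USER" -p"$DB_PASS" scrape_db -N -B \
      -e "SELECT id, url FROM queue LIMIT 100" |
while IFS="$(printf '\t')" read -r id url; do
    curl -s "$url" > "pages/$id.html"        # the actual scraping work
    mysql -u "$DB_USER" -p"$DB_PASS" scrape_db \
          -e "DELETE FROM queue WHERE id = $id"
done
```

Whether that's actually nicer than the PHP version is debatable, but it sidesteps PHP's execution time limit entirely.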
Just rambling at this point. Sorry. :D
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php