On Jul 15, 2010, at 7:36 AM, Yves Chedemois wrote:
Queue D6 backport sounds like a good idea - http://drupal.org/project/drupal_queue
Never tested it, though.
(I had overlooked this message.)
I use Drupal Queue in Feeds and Push Hub and it is very reliable.
Check out the README for how to set it up.
I've used drupal_queue on a project recently and I very much like it.
It's a backport of the Drupal 7 queue system, too, which gives you an
easier upgrade path in the future. It was designed to scale to millions
of queue items, although not with the default backend plugin. The
default one is a simple database-backed queue.
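For context, here is a minimal sketch of that queue pattern, assuming the
D6 backport mirrors the Drupal 7 API as its README describes; the queue
name, item structure, and callbacks below are hypothetical.

  <?php
  // Producer: enqueue work instead of processing it inline.
  drupal_queue_include(); // the backport's loader; not needed in Drupal 7
  $queue = DrupalQueue::get('mymodule_nodes'); // hypothetical queue name
  $queue->createItem(array('nid' => 123));

  // Consumer: declare a worker that cron runs in small, time-boxed slices,
  // so no single run exhausts memory or the time limit.
  function mymodule_cron_queue_info() {
    return array(
      'mymodule_nodes' => array(
        'worker callback' => 'mymodule_process_item',
        'time' => 60, // max seconds spent on this queue per cron run
      ),
    );
  }

  function mymodule_process_item($item) {
    // Handle exactly one queued item; keep it small and restartable.
    $node = node_load($item['nid'], NULL, TRUE);
    // ... do the actual work ...
  }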
On Jul 15, 2010, at 10:34 AM, Moshe Weitzman wrote:
drush has some built-in support for redispatching (typically used when
memory gets too low). This is used by the migrate module and drush's
own updb command, though others have noted that it can be just as easy
to bail out when you get too low on memory and call the same drush
command on your own.
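A sketch of the bail-out half of that pattern (the threshold, the exit
code, and the assumption that memory_limit looks like "128M" are all made
up here; exit 0 is taken to mean "finished", non-zero "run me again", as
in the shell wrapper shown further down):

  <?php
  // TRUE when the process is close to its PHP memory limit.
  function mymodule_memory_nearly_exhausted() {
    $limit = ini_get('memory_limit');
    if ($limit == -1) {
      return FALSE; // unlimited
    }
    $bytes = (int) $limit * 1024 * 1024; // crude: assumes an "M" suffix
    return memory_get_usage() > 0.8 * $bytes;
  }

  // Inside the long-running loop:
  if (mymodule_memory_nearly_exhausted()) {
    exit(1); // non-zero: not finished, the wrapper should redispatch
  }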
Khalid Baheyeldin wrote:
> I ran into this before, and implemented a workaround.
>
> No amount of unset() would free memory.
You should try again using PHP 5.3; the memory management is said to
be much better. This is especially awesome for huge co…
I just ran into this exact problem. I elected to solve it via the D6 backport
of drupal_queue/cron (after trying BatchAPI) and am pretty happy with it.
Joe
On Jul 15, 2010, at 10:24 AM, Alex Barth wrote:
On Jul 15, 2010, at 9:04 AM, Ken Rickard wrote:
Drush scripts (especially bulk node processing) are subject to hitting
PHP memory limits when processing large amounts of data.
Anyone have ways around that issue?
Again: batching. Even tasks that are run with drush need to be worked
off in chunks.
I ran into this before, and implemented a workaround.
No amount of unset() would free memory.
The workaround was to ration the number of loop iterations the script
performs, then end it and start a new one from where it left off.

while true
do
  # PHP CLI or drush script goes here (hypothetical command); assumed
  # to exit 0 when all work is done and non-zero while work remains.
  drush mymodule-process-chunk
  NUM=$?
  [ "$NUM" -eq 0 ] && break
done
Another option to check out is the job_queue module. It takes some changes
to your module, but the payoff is huge.
Check out queue_mail as an example of a module that uses job_queue.
On Thu, Jul 15, 2010 at 7:36 AM, Yves Chedemois wrote:
> Queue D6 backport sounds like a good idea -
> http://drupal.org/project/drupal_queue
http://drupal.org/project/job_queue I use it all of the time and it works
great. Sometimes I will make a custom module with a system weight lower
than job_queue's and call ini_set() in hook_cron to increase a timeout if
needed.
Cheers,
Neil
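A sketch of that trick (the module name and the values are made up);
because modules with a lower system weight run first during cron, the
raised limits are already in effect when job_queue's hook_cron fires:

  <?php
  // In mymodule.module; give mymodule a lower weight than job_queue
  // in the {system} table.
  function mymodule_cron() {
    ini_set('max_execution_time', 600); // hypothetical values
    ini_set('memory_limit', '256M');
  }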
unset() and paging your batch chunks with LIMIT are the standard
strategies if you wrote the cron yourself.
You might also be able to use ini_set('memory_limit', '200M'); but
actually using less memory is generally preferable.
- Ken Winters
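A sketch of that paging pattern for a hand-rolled D6 cron task (the chunk
size, variable name, and per-node worker are hypothetical):

  <?php
  define('MYMODULE_CHUNK', 200); // hypothetical chunk size

  function mymodule_cron() {
    $offset = (int) variable_get('mymodule_offset', 0);
    // LIMIT the work per run instead of loading everything at once.
    $result = db_query_range("SELECT nid FROM {node} ORDER BY nid",
      $offset, MYMODULE_CHUNK);
    while ($row = db_fetch_object($result)) {
      // TRUE resets node_load()'s static cache, the usual D6 memory hog;
      // unset() alone leaves the cached copy behind.
      $node = node_load($row->nid, NULL, TRUE);
      mymodule_process_node($node); // hypothetical worker
      unset($node);
    }
    variable_set('mymodule_offset', $offset + MYMODULE_CHUNK);
  }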
On Thu, Jul 15, 2010 at 9:04 AM, Ken Rickard wrote:
> Drush scripts (especially bulk node processing) are subject to hitting
> PHP memory limits when processing large amounts of data.
>
> Anyone have ways around that issue?
Besides …?
andrew
On Thu, Jul 15, 2010 at 7:46 AM, Earnie Boyd wrote:
> Sven Decabooter wrote:
>>
>> Any pointers as to how I could have large chunks of data processed on
>> cron in another way?
>>
>
> Not by using hook_cron but by creating a separate script that you execute
> within the server cron. The cron process sets a hard limit of 240 seconds
> to execute hook_cron implementations.
Drush scripts (especially bulk node processing) are subject to hitting
PHP memory limits when processing large amounts of data.
Anyone have ways around that issue?
On Thu, Jul 15, 2010 at 8:49 AM, Moshe Weitzman wrote:
> I think drush scripts are your best bet. CLI PHP is not subject to
> timeouts.
I think drush scripts are your best bet. CLI PHP is not subject to timeouts.
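That is easy to verify: the CLI SAPI hardcodes the execution-time limit
to 0 (unlimited), whatever php.ini says for the web SAPIs. Note that the
memory limit, unlike the time limit, still applies on the CLI, which is
exactly the problem raised below.

  <?php
  // Run with: php check.php
  echo php_sapi_name(), "\n";               // "cli"
  echo ini_get('max_execution_time'), "\n"; // "0", i.e. no timeout on CLI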
On Thu, Jul 15, 2010 at 5:01 AM, Sven Decabooter wrote:
> Hi,
> I'm reading contradictory posts about running Batch API processes on cron.
> This is for Drupal 6, BTW.
> I have tried implementing batch functionality that should be run on cron,
> but it doesn't seem to process the work that needs to be done.
Sven Decabooter wrote:
Any pointers as to how I could have large chunks of data processed on
cron in another way?
Not by using hook_cron but by creating a separate script that you
execute within the server cron. The cron process sets a hard limit of
240 seconds to execute hook_cron implementations.
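A minimal sketch of such a script, assuming a standard Drupal 6 layout
(the path and the worker function are hypothetical); since the server's
crontab runs it directly, Drupal's 240-second cron cap never applies:

  <?php
  // process.php, run from the server crontab, e.g.:
  //   */15 * * * * /usr/bin/php /var/www/html/process.php
  chdir('/var/www/html'); // the Drupal 6 root (assumption)
  require_once './includes/bootstrap.inc';
  drupal_bootstrap(DRUPAL_BOOTSTRAP_FULL);
  mymodule_process_pending_items(); // hypothetical worker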
Queue D6 backport sounds like a good idea -
http://drupal.org/project/drupal_queue
Never tested it, though.
Yves
On 15/07/2010 13:28, Sven Decabooter wrote:
That's clear, and it makes sense. Thanks Yves!
Any pointers as to how I could have large chunks of data processed on
cron in another way?
That's clear, and it makes sense. Thanks Yves!
Any pointers as to how I could have large chunks of data processed on cron
in another way?
Sven
On Thu, Jul 15, 2010 at 1:25 PM, Yves Chedemois wrote:
> Batch API works around the PHP timeout limitation by relying on a client
> browser to iterate separate requests, each of which stays below the time
> limitation.
Batch API works around the PHP timeout limitation by relying on a client
browser to iterate separate requests, each of which stays below the time
limitation.
So yes, Batch API can only be used in a UI context, which excludes cron.
For the same reason, it is not recommended to fire a batch process from
code that does not run in a browser context.
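To make the browser dependence concrete, this is roughly how a D6 batch
is normally started (names hypothetical); the redirect to batch.php is
the step that cron can never follow:

  <?php
  function mymodule_form_submit($form, &$form_state) {
    $batch = array(
      'title' => t('Processing nodes'),
      'operations' => array(
        array('mymodule_batch_step', array()), // hypothetical callback
      ),
      'finished' => 'mymodule_batch_finished',
    );
    batch_set($batch);
    // Form API now calls batch_process(), which redirects the browser to
    // batch.php; the browser keeps re-requesting it until all operations
    // finish. With no browser (cron, CLI), that iteration never happens.
  }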
Hi,
I'm reading contradictory posts about running Batch API processes on cron.
This is for Drupal 6, BTW.
I have tried implementing batch functionality that should be run on cron,
but it doesn't seem to process the work that needs to be done.
I assume this is because running the cron through a command line does not
provide the browser context that Batch API needs.