php-general Digest 13 Sep 2011 07:35:55 -0000 Issue 7475

Topics (messages 314774 through 314779):

Re: PHP cron job optimization
        314774 by: Igor Escobar
        314775 by: Eric Butera
        314777 by: Igor Escobar

Stop PHP execution on client connection closed
        314776 by: Marco Lanzotti
        314778 by: Al
        314779 by: Marco Lanzotti

Administrivia:

To subscribe to the digest, e-mail:
        php-general-digest-subscr...@lists.php.net

To unsubscribe from the digest, e-mail:
        php-general-digest-unsubscr...@lists.php.net

To post to the list, e-mail:
        php-gene...@lists.php.net


----------------------------------------------------------------------
--- Begin Message ---
Another good point: always set a connection timeout when fetching the RSS
data, so your process doesn't get stuck unnecessarily. Use cURL (it is much
faster than file_get_contents).

Multithreading in PHP with cURL http://devzone.zend.com/article/3341
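
For reference, a minimal sketch of the curl_multi approach the linked article describes, with the timeouts mentioned above (the feed URLs are placeholders, not real endpoints):

```php
<?php
// Hypothetical feed list -- replace with your own URLs.
$feeds = [
    'https://example.com/feed1.rss',
    'https://example.com/feed2.rss',
];

$mh = curl_multi_init();
$handles = [];

foreach ($feeds as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);  // give up connecting after 5s
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);        // abort the whole transfer after 15s
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers concurrently instead of one by one.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);   // wait for socket activity, don't busy-loop
    }
} while ($running && $status === CURLM_OK);

foreach ($handles as $url => $ch) {
    $body = curl_multi_getcontent($ch);
    // ... parse $body as RSS here ...
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

With a slow or dead feed, the TIMEOUT options ensure that handle fails on its own without delaying the others.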


Regards,
Igor Escobar
Software Engineer
+ http://blog.igorescobar.com
+ http://www.igorescobar.com
+ @igorescobar <http://www.twitter.com/igorescobar>





On Mon, Sep 12, 2011 at 10:05 AM, Igor Escobar <titiolin...@gmail.com> wrote:

> Use PHP threads. Do the job separately, in parts; in other words, you
> can't read them all at once.
>
> You can read a little more about php multithreading here:
> http://blog.motane.lu/2009/01/02/multithreading-in-php/
>
> You can use a non-relational database like MongoDB or CouchDB to keep track
> of where you stopped and where you need to resume in each RSS feed as well.
>
> []'s
>
> Regards,
> Igor Escobar
> *Software Engineer
> *
> + http://blog.igorescobar.com
> + http://www.igorescobar.com
> + @igorescobar <http://www.twitter.com/igorescobar>
>
>
>
>
>
>
> On Sat, Sep 10, 2011 at 10:37 PM, Stuart Dallas <stu...@3ft9.com> wrote:
>
>> On 10 Sep 2011, at 09:35, muad shibani wrote:
>>
>> > I want to design an application that reads news from RSS sources.
>> > I have about 1000 RSS feeds to collect from.
>> >
>> > I will also use cron jobs every 15 minutes to collect the data.
>> > The question is: is there a clever way to collect all those feed items
>> > without exhausting the server? Any ideas?
>>
>> I designed a job queuing system a while back when I had a similar problem.
>> You can read about it here: http://stut.net/2009/05/29/php-job-queue/.
>> Set that type of system up and add a job for each feed, set to run every 15
>> minutes. You can then watch the server and tune the number of concurrent job
>> processors so you get the optimum balance between load and speed.
>>
>> -Stuart
>>
>> --
>> Stuart Dallas
>> 3ft9 Ltd
>> http://3ft9.com/
>> --
>> PHP General Mailing List (http://www.php.net/)
>> To unsubscribe, visit: http://www.php.net/unsub.php
>>
>>
>
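
Stuart's queue-and-tune approach can be sketched as a cron-started worker pool. This is a rough sketch only: it assumes the pcntl extension (CLI SAPI), and `load_pending_feeds()` / `fetch_and_store()` are hypothetical helpers standing in for your queue and fetch logic:

```php
<?php
// Placeholder helpers -- replace with your real queue/fetch code.
function load_pending_feeds(): array
{
    return ['https://example.com/a.rss', 'https://example.com/b.rss'];
}

function fetch_and_store(string $url): void
{
    // fetch one feed with cURL (with timeouts), parse it, save new items
}

$maxWorkers = 10;   // tune this number against server load, as Stuart suggests
$active = 0;

foreach (load_pending_feeds() as $url) {
    if ($active >= $maxWorkers) {
        pcntl_wait($status);        // block until one child finishes
        $active--;
    }
    $pid = pcntl_fork();
    if ($pid === -1) {
        continue;                   // fork failed; skip this feed for now
    }
    if ($pid === 0) {               // child: handle exactly one feed, then exit
        fetch_and_store($url);
        exit(0);
    }
    $active++;                      // parent: count the running child
}
while ($active-- > 0) {
    pcntl_wait($status);            // drain remaining children
}
```

Raising or lowering `$maxWorkers` is the knob for trading load against how quickly all ~1000 feeds get refreshed within the 15-minute window.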

--- End Message ---
--- Begin Message ---
On Mon, Sep 12, 2011 at 9:37 AM, Igor Escobar <titiolin...@gmail.com> wrote:
> Another good point: always set a connection timeout when fetching the RSS
> data, so your process doesn't get stuck unnecessarily. Use cURL (it is much
> faster than file_get_contents).
>
> Multithreading in PHP with cURL http://devzone.zend.com/article/3341

Thread != Multi Process.

--- End Message ---
--- Begin Message ---
@Eric ok ;)


Regards,
Igor Escobar
*Software Engineer
*
+ http://blog.igorescobar.com
+ http://www.igorescobar.com
+ @igorescobar <http://www.twitter.com/igorescobar>





On Mon, Sep 12, 2011 at 10:52 AM, Eric Butera <eric.but...@gmail.com> wrote:

>
> Thread != Multi Process.
>

--- End Message ---
--- Begin Message ---
Hi all, I'm new to the list and I already have a question for you.
I'm running a heavy query on my DB from a PHP script called via AJAX.
Because the client often aborts the AJAX connection to issue a new query,
I need to stop the running query, otherwise the DB gets overloaded.
When the AJAX connection is aborted, the PHP script doesn't stop until it
sends some output to the client, so I have to wait for the query to finish
before I can tell that the client aborted the connection.
How can I abort the query (or the script) when the AJAX connection is aborted?

Thank you,
Marco
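
Since PHP only notices the abort when it tries to send output, one workaround is to run the query asynchronously and emit a heartbeat byte while polling it. A sketch under stated assumptions: mysqli against MySQL, hypothetical credentials, output buffering disabled, and `SELECT SLEEP(30)` standing in for the real heavy query:

```php
<?php
ignore_user_abort(true);   // keep running past the abort so we can clean up

$db    = mysqli_connect('localhost', 'user', 'pass', 'mydb');   // placeholders
$admin = mysqli_connect('localhost', 'user', 'pass', 'mydb');   // 2nd conn, used for KILL
$threadId = mysqli_thread_id($db);

// Start the heavy query without blocking the script.
mysqli_query($db, 'SELECT SLEEP(30)', MYSQLI_ASYNC);

while (true) {
    $read = $error = $reject = [$db];
    if (mysqli_poll($read, $error, $reject, 0, 200000) > 0) {   // poll in 0.2s slices
        $result = mysqli_reap_async_query($db);
        break;                          // query finished normally
    }
    echo ' ';                           // push a byte so PHP can see the abort
    flush();
    if (connection_aborted()) {
        mysqli_query($admin, 'KILL QUERY ' . $threadId);  // stop the server-side work
        exit;
    }
}
// ... send the real response built from $result here ...
```

Note the heartbeat spaces become part of the response body, so the client side must tolerate leading whitespace (trimming before parsing JSON, for example).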


--- End Message ---
--- Begin Message ---
See http://us2.php.net/manual/en/function.connection-aborted.php

On 9/12/2011 10:40 AM, Marco Lanzotti wrote:
> Hi all, I'm new to the list and I already have a question for you.
> I'm running a heavy query on my DB from a PHP script called via AJAX.
> Because the client often aborts the AJAX connection to issue a new query,
> I need to stop the running query, otherwise the DB gets overloaded.
> When the AJAX connection is aborted, the PHP script doesn't stop until it
> sends some output to the client, so I have to wait for the query to finish
> before I can tell that the client aborted the connection.
> How can I abort the query (or the script) when the AJAX connection is aborted?
>
> Thank you,
> Marco


--- End Message ---
--- Begin Message ---
On 12/09/2011 21:32, Al wrote:
> See http://us2.php.net/manual/en/function.connection-aborted.php

As I wrote, PHP doesn't detect that the client aborted the connection until
it sends some data.
During the query the script doesn't send anything to the client, so it
never notices the aborted connection.
I know that function, but on its own it doesn't solve my problem...

Bye,
Marco

--- End Message ---
