Hello,
I have a simple code at: http://pastebin.com/KeU4XANP
This script downloads a small CSS file from the Yahoo CDN 1000 times. I
benchmarked this script against ApacheBench (ab):
POE/Component::Client::HTTP: 70 requests / sec
ab (ab -n 1000 -c 100
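A common cause of low throughput in benchmarks like this is issuing requests one at a time, while ab with -c 100 keeps 100 requests in flight. Below is a minimal sketch of keeping a fixed number of concurrent requests going with POE::Component::Client::HTTP; the URL, counts, and event names are illustrative, not taken from the pastebin script.

```perl
use strict;
use warnings;
use POE qw(Component::Client::HTTP);
use HTTP::Request;

# Illustrative parameters (not from the original script).
my $url   = 'http://example.com/style.css';
my $total = 1000;   # total requests to issue
my $conc  = 100;    # keep this many in flight, like ab -c 100

POE::Component::Client::HTTP->spawn(
    Alias   => 'ua',
    Timeout => 30,
);

POE::Session->create(
    inline_states => {
        _start => sub {
            # Prime the pipeline with $conc parallel requests.
            $_[KERNEL]->yield('next_request') for 1 .. $conc;
        },
        next_request => sub {
            return if $total <= 0;
            $total--;
            $_[KERNEL]->post(ua => request => 'got_response',
                HTTP::Request->new(GET => $url));
        },
        got_response => sub {
            # Reuse the slot: start another request as one finishes.
            $_[KERNEL]->yield('next_request');
        },
    },
);

POE::Kernel->run;
```

If the pastebin script waits for each response before posting the next request, its 70 requests/sec would reflect serial round-trip latency rather than POE overhead.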
Hi,
I am using the POE-Component-Client-NNTP component to download
newsgroup articles. One of the problems I found is that when the
network disconnects, the component hangs and never quits the event
loop, so my program waits forever.
I have reported this as a bug:
For example, I am using POE::Component::Client::NNTP
http://search.cpan.org/~bingos/POE-Component-Client-NNTP-2.14/lib/POE/Component/Client/NNTP.pm
This module does not specify any timeout, so sometimes my program
seems to hang and does not respond.
Is it possible to specify a timeout or
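One general pattern, independent of whether the component itself grows a timeout option, is a watchdog timer in your own session: arm a named POE delay, re-arm it on every server response, and shut things down if it ever fires. A sketch, assuming the NNTP component is registered under the alias 'NNTP-Client' and accepts a 'shutdown' event (both assumptions; check the component's docs):

```perl
use strict;
use warnings;
use POE;

POE::Session->create(
    inline_states => {
        _start => sub {
            $_[KERNEL]->post('NNTP-Client' => register => 'all');
            $_[KERNEL]->post('NNTP-Client' => 'connect');
            $_[KERNEL]->delay(watchdog => 60);   # arm the timeout
        },
        nntp_220 => sub {
            # Any server response re-arms the watchdog; a delay with
            # the same name replaces the pending one.
            $_[KERNEL]->delay(watchdog => 60);
            # ... issue the next NNTP command here ...
        },
        watchdog => sub {
            warn "No NNTP traffic for 60s; shutting down\n";
            $_[KERNEL]->post('NNTP-Client' => 'shutdown');
        },
    },
);

POE::Kernel->run;
```

When the connection dies silently, no response events arrive, the delay fires, and the session tears the component down instead of waiting forever.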
Hi ALL,
On Thu, Mar 4, 2010 at 4:22 AM, p...@0ne.us p...@0ne.us wrote:
profiling. Also, there are numerous Loop adapters ( POE::Loop::IO_Poll,
POE::XS::Loop::EPoll, etc ) that could take advantage of your platform or
workload to reduce the overhead. As always, benchmark/test your code against
I am profiling my POE program, which fetches HTML pages from remote
sites. The results show that 80% of the time is spent in POE/Loop/Select.pm.
Is this normal and expected behavior?
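Worth noting when reading such profiles: much of the time attributed to POE/Loop/Select.pm is typically the process blocking in select() waiting for network I/O, i.e. idle time, not CPU work. To try one of the alternative loop adapters mentioned above, the loop is chosen at load time; a sketch of the two documented styles (verify against the docs of the specific loop module you install):

```perl
# Bundled adapter: ask POE to load POE::Loop::IO_Poll.
use POE qw(Loop::IO_Poll);

# Or, for an external XS loop, name it when loading POE::Kernel:
# use POE::Kernel { loop => 'POE::XS::Loop::EPoll' };
# use POE;
```

Benchmark before and after; for a network-bound crawler the loop adapter usually matters far less than request concurrency.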
On Wed, Dec 2, 2009 at 2:56 PM, Nick Perez n...@nickandperla.net wrote:
On Wed, 2 Dec 2009 12:42:34 +0800
Ryan Chan ryanchan...@gmail.com wrote:
If I go with the fork solution, is there any abstraction you would recommend?
I don't want to scare you off, but I can also suggest POEx::WorkerPool
Hello,
Consider my code below; I would like to execute the sleep() function in
parallel, using the POE JobQueue component:
#=
use strict;
use POE qw(Component::JobQueue);
# Passive queue waits for enqueue events.
POE::Component::JobQueue->spawn(
  Alias => 'passive',
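For reference, a fuller sketch of the passive-queue pattern. Note the key caveat, which is the crux of this thread: JobQueue workers are sessions in the same process, so a blocking sleep() in a worker stalls everything; a POE delay gives you concurrent (interleaved) waits, while true CPU parallelism needs fork. Event names and the worker limit here are illustrative.

```perl
use strict;
use warnings;
use POE qw(Component::JobQueue);

# Passive queue: waits for 'enqueue' events, runs up to 4 workers.
POE::Component::JobQueue->spawn(
    Alias       => 'passive',
    WorkerLimit => 4,
    Worker      => sub {
        my ($postback, $seconds) = @_;
        POE::Session->create(
            inline_states => {
                _start => sub {
                    # A blocking sleep() here would stall the whole
                    # process; use a POE delay for a non-blocking wait.
                    $_[KERNEL]->delay(done => $seconds);
                },
                done => sub { $postback->($seconds) },
            },
        );
    },
);

POE::Session->create(
    inline_states => {
        _start => sub {
            # Enqueue five jobs; results arrive at 'job_done'.
            $_[KERNEL]->post(passive => enqueue => job_done => $_)
                for 1 .. 5;
        },
        job_done => sub {
            my ($request, $response) = @_[ARG0, ARG1];
            print "job slept $response->[0]s\n";
        },
    },
);

POE::Kernel->run;
```

With delays instead of sleep(), the four workers "sleep" concurrently in one process.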
On Wed, Dec 2, 2009 at 5:31 AM, Rocco Caputo rcap...@pobox.com wrote:
The other general advice is to use fork(), with or without POE, when you
need true parallelism.
It seems POE::Component::Pool::Thread could solve the problem? (But I
was unable to install it via CPAN on Ubuntu.)
If I go with
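On the fork route suggested above, POE::Wheel::Run lets the event loop manage forked children without blocking. A minimal sketch (the child job and event names are illustrative):

```perl
use strict;
use warnings;
use POE qw(Wheel::Run);

POE::Session->create(
    inline_states => {
        _start => sub {
            my ($kernel, $heap) = @_[KERNEL, HEAP];
            # Fork four children; each sleeps truly in parallel,
            # in its own process.
            for my $n (1 .. 4) {
                my $wheel = POE::Wheel::Run->new(
                    Program     => sub { sleep 2; print "child $n done\n" },
                    StdoutEvent => 'child_out',
                );
                # Keep the wheel alive, and reap the child on exit.
                $heap->{wheels}{ $wheel->ID } = $wheel;
                $kernel->sig_child($wheel->PID, 'child_gone');
            }
        },
        child_out  => sub { print "got: $_[ARG0]\n" },
        child_gone => sub {
            # Child exited; its wheel can be discarded here.
        },
    },
);

POE::Kernel->run;
```

Higher-level abstractions such as POE::Component::Generic or the POEx::WorkerPool mentioned below wrap this same fork-and-talk pattern.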
Hello,
Assume I only have a dual-core server with 1 GB of memory, and I
want to build a web robot to crawl 1000 pre-defined web sites.
Can anyone suggest a basic strategy for this task?
Should I create 1000 sessions at the same time, to achieve the maximum
network throughput?
Thanks.
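One common answer to this kind of question is: don't spawn 1000 sessions at once; keep a URL queue and a bounded number of requests in flight, which caps memory while keeping the network busy. A sketch under those assumptions (the site list, limit of 20, and event names are made up for illustration):

```perl
use strict;
use warnings;
use POE qw(Component::Client::HTTP);
use HTTP::Request;

# Illustrative inputs: 1000 queued URLs, at most 20 in flight.
my @sites = map { "http://site$_.example.com/" } 1 .. 1000;
my $limit = 20;

POE::Component::Client::HTTP->spawn(Alias => 'ua', Timeout => 60);

POE::Session->create(
    inline_states => {
        _start => sub { $_[KERNEL]->yield('fill') },
        fill   => sub {
            my $heap = $_[HEAP];
            # Top up to $limit concurrent requests.
            while (@sites && ($heap->{in_flight} // 0) < $limit) {
                $heap->{in_flight}++;
                $_[KERNEL]->post(ua => request => 'response',
                    HTTP::Request->new(GET => shift @sites));
            }
        },
        response => sub {
            $_[HEAP]{in_flight}--;
            # ... parse / store the fetched page here ...
            $_[KERNEL]->yield('fill');   # start the next URL
        },
    },
);

POE::Kernel->run;
```

Tune the in-flight limit empirically: raise it until throughput stops improving or memory use (response bodies held in RAM) approaches your 1 GB budget.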