Re: unexpected _child(lose) for a detached session from _start
2009/11/20 Rocco Caputo rcap...@pobox.com:

> Agreed, a _child(lose) before the _child(create) is bad.

This is a case of _child(lose) without any _child(create) ever (neither before nor after). It should either be a _child(create)/_child(lose) pair, or nothing in this case.

> I'm tempted to go with nothing since it would be hard to fix the create/lose timing.

I'm tempted to go with nothing too, as the session is declared detached.

> Also the new(detached => 1) seems good, but I admit I haven't given it much thought yet. Sometimes I change my mind after thinking about things too hard.
>
> If you haven't already, please submit this to bug-...@rt.cpan.org. I'm liable to forget about POE bugs, patches, tests, etc. if they're not in POE's bug tracker.

See bug #51772.

> Thank you again for all your help.

POE is helping me a lot, so thanks to you. It helped me build my application over the last 9 months. However, debugging is painful, and I have been tracking down a particular bug in my code for 3 weeks now. I just discovered POE::API::Peek, and it looks like the toolbox I had been looking for for months.

Olivier.
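For readers following the thread, the detachment scenario under discussion can be reproduced with a short script. This is a minimal sketch, assuming a current POE from CPAN; POE::Kernel's existing detach_myself() call is used to detach the child (the new(detached => 1) constructor option mentioned above was only a proposal at the time, not a shipped API), and the print simply reports whichever _child events the parent actually receives:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POE;    # requires the POE distribution from CPAN

# Parent session spawns a child; the child detaches itself in its
# own _start handler. The _child events printed by the parent are
# exactly the create/lose timing being debated in this thread.
POE::Session->create(
    inline_states => {
        _start => sub {
            POE::Session->create(
                inline_states => {
                    _start => sub {
                        # Reparent this session to the kernel.
                        $_[KERNEL]->detach_myself();
                    },
                },
            );
        },
        _child => sub {
            my ($reason, $child) = @_[ARG0, ARG1];
            print "parent got _child($reason) for $child\n";
        },
    },
);
POE::Kernel->run();
```

Running this against the POE version in question should show whether the parent gets a lone _child(lose), a create/lose pair, or nothing.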
Strategy in designing a scalable robot with POE::Component::Client::HTTP?
Hello,

Assume I only have a dual-core server with a limited 1 GB of memory. I want to build a web robot to crawl 1000 pre-defined web sites. Can anyone suggest a basic strategy for this task? Should I create 1000 sessions at the same time to achieve the maximum network throughput?

Thanks.
Re: Strategy in designing a scalable robot with POE::Component::Client::HTTP?
Find out how many sites your system and network will let you crawl at once. Limit the number of parallel jobs to that.

Read about tuning POE::Component::Client::HTTP, either in the documentation or in this mailing list's archives.

Stay under your system's limits. Consider how performance plummets when a machine overcommits its memory and begins swapping. Don't let that happen to you.

Use fork() with POE to take advantage of both cores.

Are you looking for a design consultant?

--
Rocco Caputo - rcap...@pobox.com

On Nov 20, 2009, at 23:15, Ryan Chan wrote:

> Hello,
>
> Assume I only have a dual-core server with a limited 1 GB of memory. I want to build a web robot to crawl 1000 pre-defined web sites. Can anyone suggest a basic strategy for this task? Should I create 1000 sessions at the same time to achieve the maximum network throughput?
>
> Thanks.
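To make the "limit the number of parallel jobs" advice concrete, here is a hedged sketch of a bounded-concurrency fetcher built on POE::Component::Client::HTTP. The URL list, the 'ua' alias, and the window size of 20 are illustrative placeholders, not recommendations; MAX_PARALLEL is the knob you would tune to what your own system and network measurements allow:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POE qw(Component::Client::HTTP);    # from CPAN
use HTTP::Request::Common qw(GET);

# Placeholder work queue; in practice, load your 1000 site URLs here.
my @queue = map { "http://site$_.example.com/" } 1 .. 1000;

my $MAX_PARALLEL = 20;    # assumed window size; tune from measurement
my $in_flight    = 0;

POE::Component::Client::HTTP->spawn(
    Alias   => 'ua',
    Timeout => 30,
);

POE::Session->create(
    inline_states => {
        _start   => \&refill,
        refill   => \&refill,
        response => \&handle_response,
    },
);

# Top the in-flight window up to MAX_PARALLEL requests.
sub refill {
    my $kernel = $_[KERNEL];
    while ($in_flight < $MAX_PARALLEL and @queue) {
        my $url = shift @queue;
        $in_flight++;
        $kernel->post(ua => request => response => GET $url);
    }
}

# One slot frees per response; refill keeps the window full.
sub handle_response {
    my ($kernel, $request_packet, $response_packet) = @_[KERNEL, ARG0, ARG1];
    my $response = $response_packet->[0];
    print $request_packet->[0]->uri, " => ", $response->code, "\n";
    $in_flight--;
    $kernel->yield('refill');
}

POE::Kernel->run();
```

The same windowed-queue pattern works if you fork() one such process per core, as suggested above, splitting the URL list between them.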