On Tue, Feb 28, 2012 at 10:26 PM, Marcelo Zamperetti <[email protected]> wrote:

> I was thinking of showing the results to the user only after all the
> searches have finished, with maybe a timeout of 1 or 2 seconds for each
> individual search, so the whole process would never take more than 2
> seconds, even if some results are occasionally lost.
> This way the user will see everything properly ordered right away,
> instead of results 'slipping' down the page with AJAX or whatever.
> Anyway, you talked about the 'new scheduler'. Is it covered in the
> book?
> I saw the experimental scheduler, but I don't get how it would fit my
> problem. I'm not going to schedule the tasks; they will be run on
> demand.
> More to the point, I just need to run
>
> x = urllib2.urlopen(bla)
> x.read()
>
> many times, all at the same time, without one waiting for another.
> Just that.
>

The problem is that the "just that" doesn't scale. Is this application
supposed to stay small? If so, just use threads and it will work.

But if you expect 100 users, each one opening 20 threads, it's
clear that there is a limitation, isn't it?
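A minimal sketch of the bounded alternative, assuming Python's concurrent.futures (the names `fetch` and `meta_search` are placeholders I made up; real code would call urllib2.urlopen with a timeout, as in the snippet quoted above, instead of the stub here):

```python
# Sketch only: run all sub-searches concurrently through a *bounded*
# thread pool, with an overall deadline, instead of one unmanaged
# thread per request per user. fetch() is a stub standing in for
# urllib2.urlopen(url).read() from the quoted snippet.
from concurrent.futures import ThreadPoolExecutor, as_completed, TimeoutError

def fetch(url):
    # real code: urllib2.urlopen(url, timeout=2).read()
    return "results from %s" % url

def meta_search(urls, max_workers=20, deadline=2.0):
    pool = ThreadPoolExecutor(max_workers=max_workers)
    futures = {pool.submit(fetch, url): url for url in urls}
    results = {}
    try:
        for future in as_completed(futures, timeout=deadline):
            results[futures[future]] = future.result()
    except TimeoutError:
        pass  # searches that miss the deadline are simply dropped
    pool.shutdown(wait=False)  # do not block on stragglers
    return results
```

Because the pool is shared and bounded, 100 simultaneous users cannot spawn 2000 threads; slow searches burn a worker slot for a while but never delay the response past the deadline.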

[],


>
> Thanks.
>
>
> On Feb 28, 7:52 am, Luciano Pacheco <[email protected]> wrote:
> > How are you thinking of showing this to the user?
> >
> > Case 1:
> > The user requests a search => you (web2py) dispatch all 5 (up to 20)
> > sub-external-searches, and *only* after all of them finish do you
> > send the response to the user.
> >
> > Case 2:
> > The user requests a search => you (web2py) put all 5 - 20
> > sub-external-searches into background tasks and immediately send the
> > user a response like "Your search is being performed" (you can use
> > javascript to poll the server), then show the final result after the
> > background tasks have finished.
> >
> > In case 1, no language will get you a response in a reasonable time,
> > because external access always has a chance of problems, either in
> > your network or in any of the other 5 to 20 networks.
> >
> > For case 2, I think you can use the new web2py scheduler.
> >
> > In case 2, with the scheduler, it seems to be a good idea to use
> > threads, even with Python and its GIL. :-)
> >
> > [],
> >
> > On Tue, Feb 28, 2012 at 6:05 PM, Marcelo Zamperetti <[email protected]>
> > wrote:
> >
> > > I know a bit of programming but am still new to Python and web2py
> > > (and to web programming in general, actually), so sorry if I make a
> > > gaffe, and for my poor English.
> > > I intend to develop a meta-search website. It queries a bunch of
> > > other sites for results, processes them, and shows them to the
> > > user.
> > > I need to query all the other sites at the same time (I'll start
> > > with 5 but intend to go up to around 15 or 20); if I do one after
> > > another I won't be able to respond to the user in a reasonable
> > > time. I found the part of the book about background tasks, but none
> > > of the approaches seemed to fit my problem. They are about running
> > > tasks at pre-determined times or running other applications. I just
> > > need something as simple as threads, but I've read somewhere that I
> > > can't spawn threads from the controller like in a normal Python
> > > application, and from what I've understood it is in the controller
> > > that my code will run.
> > > I'm probably missing something very simple, but I would be grateful
> > > if someone could point me the right way (even if the right way is
> > > another web framework or even another language).
> > > Thank you very much.
> >
> > --
> > Luciano Pacheco
> > blog.lucmult.com.br
>
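The "Case 2" shape from the quoted message can be sketched without web2py at all. All names below (`submit_search`, `poll_search`, `run_all`, the in-memory `_jobs` store) are hypothetical; in web2py the scheduler would play the role of the executor, and the polling endpoint would be a controller action hit by javascript:

```python
# Sketch only: submit the sub-searches as a background job, return a
# job id immediately, and let the client poll until the job is done.
import uuid
from concurrent.futures import ThreadPoolExecutor

_pool = ThreadPoolExecutor(max_workers=20)
_jobs = {}  # job id -> Future; a real app would use a db table

def run_all(urls):
    # stand-in for the 5 - 20 sub-external-searches
    return ["results from %s" % u for u in urls]

def submit_search(urls):
    # respond right away: "Your search is being performed"
    job_id = str(uuid.uuid4())
    _jobs[job_id] = _pool.submit(run_all, urls)
    return job_id

def poll_search(job_id):
    # None means "not ready yet"; javascript keeps polling
    future = _jobs[job_id]
    if not future.done():
        return None
    return future.result()
```

The key property is that `submit_search` never blocks on the external sites, so the web request that triggers the search always returns quickly, regardless of how slow the 5 - 20 networks are.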



-- 
Luciano Pacheco
blog.lucmult.com.br
