In article <mailman.11325.1404048700.18130.python-l...@python.org>,
Dave Angel <da...@davea.name> wrote:
> subhabangal...@gmail.com Wrote in message:
> > Dear Group,
> > I am trying to crawl multiple URLs. As the pages come in, I want to write
> > them out as strings, preferably via a queue.
> > If any one of the esteemed members of the group may kindly help.
> From your subject line, it appears you want to keep multiple files open,
> and write to each in an arbitrary order. That's no problem, up to the
> operating system limits. Define a class that holds the URL information and
> for each instance, add an attribute for an output file handle.
> Don't forget to close each file when you're done with the corresponding URL.
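Dave's suggestion might look something like this minimal sketch (the names `UrlWriter` and `fetch_and_save` are mine, not from any posted code):

```python
import urllib.request

class UrlWriter:
    """Pairs one URL with its own open output file (hypothetical helper)."""

    def __init__(self, url, out_path):
        self.url = url
        # An attribute holding the output file handle, per Dave's suggestion.
        self.out = open(out_path, "w", encoding="utf-8")

    def fetch_and_save(self):
        # Fetch the page with the stdlib and write it out as a string.
        with urllib.request.urlopen(self.url) as resp:
            self.out.write(resp.read().decode("utf-8", errors="replace"))

    def close(self):
        # Close the file when you're done with this URL.
        self.out.close()
```

You'd keep one instance per URL, call fetch_and_save() on each in whatever order the pages arrive, and close() each when finished.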
One other thing to mention is that if you're doing anything with
fetching URLs from Python, you almost certainly want to be using Kenneth
Reitz's excellent requests module (http://docs.python-requests.org/).
The built-in urllib support in Python works, but requests is so much
simpler to use.
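For completeness, here's a rough sketch of the queue-based crawl the original poster asked about, using requests. The crawl() helper, its parameters, and the injectable fetch function are my own invention, not anyone's posted code:

```python
import queue
import threading

import requests  # third-party: pip install requests

def crawl(urls, fetch=None, num_workers=4):
    """Fetch URLs concurrently, putting (url, text) pairs on a results
    queue as each page arrives. A sketch, not production code."""
    if fetch is None:
        # Default fetcher; injectable so tests can avoid the network.
        fetch = lambda url: requests.get(url, timeout=10).text

    todo = queue.Queue()
    results = queue.Queue()
    for url in urls:
        todo.put(url)

    def worker():
        while True:
            try:
                url = todo.get_nowait()
            except queue.Empty:
                return  # nothing left to fetch
            results.put((url, fetch(url)))

    threads = [threading.Thread(target=worker) for _ in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

From there, draining the results queue and handing each (url, text) pair to its own output file (as Dave described) is straightforward.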