On Sunday, June 29, 2014 4:19:27 PM UTC+5:30, subhaba...@gmail.com wrote:
Dear Group,
I am trying to crawl multiple URLs and, as each page arrives, write its source out as a string, preferably through a queue.
I would be grateful if any of the esteemed members of the group could kindly help.
Regards,
Subhabrata Banerjee.
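A minimal sketch of what is being asked, assuming Python 3's urllib.request and queue modules; the injectable fetch parameter and function names are my own additions for illustration, not from the thread:

```python
from queue import Queue
from urllib.request import urlopen

def fetch(url):
    # download one page and return its source as a string
    with urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def crawl_to_queue(urls, fetch=fetch):
    # fetch each URL in turn and put the page text into a FIFO queue
    q = Queue()
    for url in urls:
        q.put(fetch(url))
    return q
```

A consumer can then pull pages off the queue with q.get() in arrival order.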
On Mon, 30 Jun 2014 12:23:08 -0700, subhabangalore wrote:
Thank you for your kind suggestion, but I am not able to sort out this line:
fp = open("scraped/body{:05d}.htm".format(n), "w")
Please suggest.
Look up the Python documentation for str.format() and the built-in open(). The line indicated opens a file named scraped/body00000.htm, scraped/body00001.htm, and so on (n zero-padded to five digits) for writing.
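The usual cause of trouble with that line is missing quotes: the filename pattern and the "w" mode must be string literals. A corrected sketch, assuming the intent is to save each page to its own numbered file (the scraped directory comes from the snippet; the save_page wrapper is my own):

```python
import os

def save_page(n, html, outdir="scraped"):
    # build a zero-padded name such as scraped/body00003.htm
    path = os.path.join(outdir, "body{:05d}.htm".format(n))
    os.makedirs(outdir, exist_ok=True)
    # "w" mode creates or truncates the file for text writing
    with open(path, "w", encoding="utf-8") as fp:
        fp.write(html)
    return path
```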
On Sun, 29 Jun 2014 10:32:00 -0700, subhabangalore wrote:
I am opening multiple URLs with urllib.open, and each URL has a huge HTML source. As these files are read I am trying to concatenate them and put them into one txt file as a string.
From this big txt file I am trying
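The concatenation step described above can be sketched as follows; pages stands for the already-fetched HTML strings, and the function name, output path, and newline separator are assumptions for illustration:

```python
def concatenate_pages(pages, out_path):
    # append every fetched page to a single text file, in order
    with open(out_path, "w", encoding="utf-8") as out:
        for html in pages:
            out.write(html)
            out.write("\n")  # separate one document from the next
    return out_path
```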