cool1...@gmail.com writes:
> Here are some scripts; how do I put them together to create the script
> I want? (To search an online document and download all the links in it.)
> P.S.: Can I set a destination folder for the downloads?
You can use os.chdir to go to the desired folder.
>
> urllib.urlopen("http://")
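For example, to fetch a file into a chosen folder (a minimal Python 2 sketch; the folder name and URL are only placeholders, and the folder must already exist):

import os
import urllib

os.chdir("downloads")  # every download below now lands in ./downloads

url = "http://www.example.com/file.pdf"  # placeholder
# urlretrieve saves the resource under the given filename in the current directory
urllib.urlretrieve(url, os.path.basename(url))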
Hi
On 08/02/2013 03:46 AM, cool1...@gmail.com wrote:
> I do know some Python programming; I just don't know enough to put
> together the various scripts I need... I would really, really
> appreciate it if someone could help me with that...
Seems like your first task, then, is to become proficient at Python.
I understand I did not ask the question correctly, but is there any chance you
can help me put together this code? I know that you all do this for fun and
enjoy it, and that is why I asked you guys instead of asking someone who would
charge me for a very simple line of code.
I would appreciate it.
On Fri, Aug 2, 2013 at 10:46 AM, wrote:
> I do know some Python programming; I just don't know enough to put together
> the various scripts I need... I would really, really appreciate it if someone
> could help me with that...
Be aware that you might be paying money for that. If you know "some"
carpe
I do know some Python programming; I just don't know enough to put together the
various scripts I need... I would really, really appreciate it if someone could
help me with that...
I know I should be testing out the script myself, and I did try, but since I am
new to Python and I work for a security firm that asks me to scan hundreds of
documents a day for unsafe links (by opening them), I thought writing a script
would be much easier. I do not know how to combine those scripts.
On 31/07/2013 6:15 PM, cool1...@gmail.com wrote:
> Here are some scripts; how do I put them together to create the script I want?
> (To search an online document and download all the links in it.)
1. Think about the requirements.
2. Write some code.
3. Test it.
4. Repeat until requirements are met.
Here are some scripts; how do I put them together to create the script I want?
(To search an online document and download all the links in it.)
P.S.: Can I set a destination folder for the downloads?
urllib.urlopen("http://")
possible_urls = re.findall(r'\S+:\S+', text)
import urllib2
respons
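Roughly, those snippets might be combined along these lines (an untested Python 2 sketch; the page URL is a placeholder, and since the r'\S+:\S+' pattern is very crude, each match is trimmed down to a plain http(s) URL before downloading):

import re
import urllib2

page_url = "http://www.example.com/document.html"  # placeholder
text = urllib2.urlopen(page_url).read()

# Crude first pass: anything that looks like scheme:something
possible_urls = re.findall(r'\S+:\S+', text)

for token in possible_urls:
    # Trim surrounding HTML (quotes, href=, trailing '>') down to the URL itself
    m = re.search(r'https?://[^\s"\'<>]+', token)
    if not m:
        continue
    url = m.group(0)
    filename = url.split("/")[-1] or "index.html"
    data = urllib2.urlopen(url).read()
    with open(filename, "wb") as f:
        f.write(data)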
On Tue, 30 Jul 2013 07:49:04 -0700, cool1574 wrote:
> Hello, I am looking for a script that will be able to search an online
> document (by giving the script the URL) and find all the downloadable
> links in the document and then download them automatically.
> I appreciate your help,
Why use Pyth
On 30Jul2013 09:12, cool1...@gmail.com wrote:
| ** urllib, urllib2
Sure. And I'd use BeautifulSoup to do the parse. You'll need to fetch that.
So: urllib[2] to fetch the document and BS to parse it for links,
then urllib[2] to fetch the links you want.
http://www.crummy.com/software/BeautifulSoup/
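To make that concrete, a rough sketch of that approach (untested; Python 2 with BeautifulSoup 4, which is an assumption, and the page URL and "downloads" folder are placeholders):

import os
import urllib2
import urlparse
from bs4 import BeautifulSoup  # pip install beautifulsoup4

page_url = "http://www.example.com/document.html"  # placeholder
soup = BeautifulSoup(urllib2.urlopen(page_url).read())

os.chdir("downloads")  # destination folder for the downloads (must exist)

for a in soup.find_all("a", href=True):
    link = urlparse.urljoin(page_url, a["href"])  # resolve relative links
    filename = os.path.basename(urlparse.urlsplit(link).path) or "index.html"
    with open(filename, "wb") as f:
        f.write(urllib2.urlopen(link).read())

Any filtering of the links (say, keeping only .pdf or .zip) would go inside that loop.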
On 30/07/2013 18:10, cool1...@gmail.com wrote:
> What if I want to use only Python? Is that possible, using urllib and urllib2?
Have a look here:
http://bazaar.launchpad.net/~vincent-vandevyvre/qarte/trunk/view/head:/parsers.py
This script gets a web page and parses it to find downloadable objects.
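The same idea using nothing but the standard library looks roughly like this (an untested Python 2 sketch, not the code behind that link; the URL and folder are placeholders):

import os
import urllib2
import urlparse
from HTMLParser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag seen while parsing."""
    def __init__(self):
        HTMLParser.__init__(self)
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page_url = "http://www.example.com/document.html"  # placeholder
parser = LinkCollector()
parser.feed(urllib2.urlopen(page_url).read())

for href in parser.links:
    link = urlparse.urljoin(page_url, href)  # resolve relative links
    filename = os.path.basename(urlparse.urlsplit(link).path) or "index.html"
    with open(os.path.join("downloads", filename), "wb") as f:  # folder must exist
        f.write(urllib2.urlopen(link).read())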
What if I want to use only Python? Is that possible, using urllib and urllib2?
On Tue, Jul 30, 2013 at 5:10 PM, wrote:
> What if I want to use only Python? Is that possible, using urllib and urllib2?
Sure, anything's possible. And it's a lot easier if you quote context in
your posts. But why do it? wget is exactly what you need.
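For example, even from inside Python the actual work can be handed straight to wget (a minimal sketch; it assumes wget is installed, and the URL and folder are placeholders):

import subprocess

url = "http://www.example.com/document.html"  # placeholder

# -r/-l 1: follow the links on that page one level deep; -nd: don't recreate
# the remote directory tree; -P downloads: save everything under ./downloads
subprocess.call(["wget", "-r", "-l", "1", "-nd", "-P", "downloads", url])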
On 30.07.2013 16:49, cool1...@gmail.com wrote:
> Hello, I am looking for a script that will be able to search an
> online document (by giving the script the URL) and find all the
> downloadable links in the document and then download them
> automatically.
Well, that's actually pretty simple. Using th
** urllib, urllib2
On Tue, Jul 30, 2013 at 4:49 PM, wrote:
> I know, but I think using Python in this situation is good... Is that the full
> script?
That script just drops out to the system and lets wget do it. So don't
bother with it.
ChrisA
I know, but I think using Python in this situation is good... Is that the full
script?