Hello Narendra,

On Sep 9, 12:18 am, narendra sisodiya <[email protected]>
wrote:
> recently I wanted to download 100+ foss seminar pdf/odp files from online.
>
> Step1 :  Create a links.txt : first open a text file and paste all links
> line by line. Every line contain a link to a resource (pdf/odp). save it as
> links.txt
> Step 2: download it using this command
>
> # wget -m -nd -i links.txt
>
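Step 1 above can also be scripted instead of pasting links by hand; a
minimal sketch (the URLs below are made-up placeholders):

```shell
# write one URL per line into links.txt (hypothetical example URLs)
cat > links.txt <<'EOF'
http://example.com/slides/foss-talk-1.pdf
http://example.com/slides/foss-talk-2.odp
http://example.com/slides/foss-talk-3.pdf
EOF

# sanity check: count the links
wc -l < links.txt
```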

you can also add the '-b' or '--background' option so wget detaches and
keeps downloading after you log out; note that it still fetches the
files one at a time, and it can consume your entire bandwidth
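For example (wget -b returns immediately and appends its progress to
wget-log in the current directory, so you can check on it later):

```shell
# run the whole batch in the background
wget -b -m -nd -i links.txt
# watch progress later
tail -f wget-log
```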

> Two advantages :
> 1. you can issue this command and go to sleep.
> 2. even if your download fails (e.g. a system reboot) you can issue the same
> command; it will download only the remaining files.

3. you can also schedule the downloads by putting the command in
crontab, which is the coolest feature
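If you do want to drive this from crontab, a sketch of an entry (the
paths are assumptions, adjust them to your setup):

```shell
# crontab -e: run the batch download every night at 2:00 am
0 2 * * * cd /home/user/downloads && wget -m -nd -i /home/user/links.txt
```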
>
> One disadvantage (not a serious one, unless you download videos and ISO files):
> 1. For example, if you have 10 links in the file, suppose the script has
> downloaded 5 files and is currently on the 6th file, 80% complete.
> If you Ctrl+C and re-run the same command, it will start downloading the 6th
> file from the beginning; adding the -c option does not help.
>
> any suggestion !!
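One possible workaround: since links.txt is a flat list of files, the
mirroring option isn't really needed, and -m turns on timestamping,
which can interfere with -c. Plain -c should resume the partial file,
assuming the server supports byte-range requests:

```shell
# resume partially downloaded files; already-complete files are skipped
wget -c -i links.txt
```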

There's one more tool, cURL, similar to wget, which outdoes Wget in
many ways and has not disappointed to date.

Comparison of cURL's features with other download tools:
http://curl.haxx.se/docs/comparison-table.html
Download: http://curl.haxx.se/download.html
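For the same batch job with cURL: it has no direct equivalent of
wget's -i for a plain URL list, so xargs feeds it one link at a time;
-O keeps the remote filename and '-C -' resumes a partial transfer:

```shell
# download every link in links.txt, resuming partial files
xargs -n 1 curl -O -C - < links.txt
```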


>
> --
> ┌─────────────────────────┐
> │    Narendra Sisodiya ( नरेन्द्र सिसोदिया )
> │    R&D Engineer
> │    Web :http://narendra.techfandu.org
> │    Twitter :http://tinyurl.com/dz7e4a
> └─────────────────────────┘
--~--~---------~--~----~------------~-------~--~----~
Do you have another question? Click here - 
http://groups.google.com/group/iitdlug/post
l...@iitd community mailing list -- http://groups.google.com/group/iitdlug
-~----------~----~----~----~------~----~------~--~---
