Thanks Gerald. I decided to download the files one at a time and will
bear with that for now.

Wire

On Fri, 2008-10-17 at 19:14 +0300, Gerald Begumisa wrote:

> > I want to download a series of files using wget from the command line.
> > The actual link is below but, as you know, the shell will not accept
> > the spaces, among other things.
> >
> > wget http://www.fsi-language-courses.com/Courses/French/Basic
> > (Revised)/Volume 1/FSI - French Basic Course (Revised) - Volume 1 - Unit
> > 01 1.3.mp3
> > 1. How can I make the correct request using wget?
> 
> Try enclosing the URL in double quotes, e.g.
> 
> wget "http://www.fsi-language-courses.com/Courses/French/Basic
> (Revised)/Volume 1/FSI - French Basic Course (Revised) - Volume 1 - Unit 01
> 1.3.mp3"
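> 
> Alternatively, you can percent-encode the special characters so the URL
> contains no shell metacharacters at all. A sketch of the same request,
> assuming the path is exactly as above (%20 encodes the spaces, %28/%29
> the parentheses, which bash would otherwise treat specially):
> 
> wget http://www.fsi-language-courses.com/Courses/French/Basic%20%28Revised%29/Volume%201/FSI%20-%20French%20Basic%20Course%20%28Revised%29%20-%20Volume%201%20-%20Unit%2001%201.3.mp3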
> 
> > 2. There are numerous files in the same directory that differ only in
> > the figures just before the .mp3 extension, e.g. Unit 01 1.3.mp3,
> > Unit 01 1.4.mp3, Unit 01 1.5.mp3, Unit 02 1.1.mp3, Unit 02 1.2.mp3,
> > etc. Does anyone have a quick script I could use?
> 
> If these are the only files in that directory, then you could try wget's
> --recursive option to pull everything in the directory down.
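> 
> Concretely, something like this might work (untested sketch; it relies
> on the server generating an HTML index for the directory, --no-parent
> keeps wget from climbing above it, and --accept mp3 restricts the
> download to mp3 files):
> 
> wget --recursive --no-parent --accept mp3 \
>     "http://www.fsi-language-courses.com/Courses/French/Basic (Revised)/Volume 1/"
> 
> Failing that, if you know the ranges, a small bash loop would do it. A
> sketch, assuming units 01-02 and parts 1.1-1.5 (adjust both lists to
> match what is actually on the server):
> 
> #!/bin/bash
> # Directory holding the course files (contains spaces, so always quote it).
> base="http://www.fsi-language-courses.com/Courses/French/Basic (Revised)/Volume 1"
> 
> for unit in 01 02; do
>     for part in 1.1 1.2 1.3 1.4 1.5; do
>         # Double quotes stop the shell from splitting the URL on spaces.
>         wget "$base/FSI - French Basic Course (Revised) - Volume 1 - Unit $unit $part.mp3"
>     done
> done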
> 