Hello Wire,

Try:

wget -nd -r -l1 --no-parent \
  "http://www.fsi-language-courses.com/Courses/French/Basic%20(Revised)/Volume%201/"

This works fine, but sometimes it also pulls down some silly [EMAIL PROTECTED] files
that I do not want. To keep your directory clean, if you know
the file format you want, you can do this:

wget -nd -r -l1 --no-parent -A.mp3 \
  "http://www.fsi-language-courses.com/Courses/French/Basic%20(Revised)/Volume%201/"

-nd          no directories; by default wget recreates the remote directory structure locally
-r           download recursively
-l1          (the letter L, then the number one) recursion depth 1: stay in that particular folder, don't go any deeper
--no-parent  don't ascend to the parent directory; I definitely don't want the parent's files
-A.mp3       accept only files whose names end in .mp3 (-A takes a comma-separated list of suffixes or patterns)
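
If you only want a handful of specific tracks rather than the whole folder, a small
shell loop is another option. This is only a rough sketch: I'm guessing the file names
follow the "Unit 01 1.3.mp3" pattern mentioned earlier in the thread, so adjust the
list to match what is actually on the server. The spaces are percent-encoded as %20 so
both the shell and the server see a single URL:

BASE="http://www.fsi-language-courses.com/Courses/French/Basic%20(Revised)/Volume%201"
for n in 1.3 1.4 1.5; do
    # one download per iteration; the quotes keep the parentheses and
    # encoded spaces from confusing the shell
    wget "$BASE/FSI%20-%20French%20Basic%20Course%20(Revised)%20-%20Volume%201%20-%20Unit%2001%20${n}.mp3"
done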


-- 
Patrick Patons Ocira

you can predict that some actions will be calculated to be unpredictable

_______________________________________________________
(c) 2008 Copyright OCIRA.net - Creative Applications. All Rights Reserved.

tel: +256 704 008818 | Kampala, Uganda


On Sat, Oct 18, 2008 at 12:25 AM, Wire James <[EMAIL PROTECTED]> wrote:
> Thanks Gerald. I decided to download one at a time. Will bear with that for
> now.
>
> Wire
>
> On Fri, 2008-10-17 at 19:14 +0300, Gerald Begumisa wrote:
>
>> I want to download a series of files using wget from the command line.
>> The actual link is as below but, as you know, Linux will not recognise
>> the spaces among other things.
>>
>> wget http://www.fsi-language-courses.com/Courses/French/Basic
>> (Revised)/Volume 1/FSI - French Basic Course (Revised) - Volume 1 - Unit
>> 01 1.3.mp3
>> 1. How can I get to make the correct request using wget?
>
> Try enclosing the URL in double quotes, e.g.
>
> wget "http://www.fsi-language-courses.com/Courses/French/Basic
> (Revised)/Volume 1/FSI - French Basic Course (Revised) - Volume 1 - Unit 01
> 1.3.mp3"
>
>> 2. There are numerous files in the same directory whose difference is
>> the figure just before .mp3 at the end e.g Unit 01 1.3.mp3, Unit 01
>> 1.4.mp3, Unit 01 1.5.mp3, Unit 02 1.1.mp3, Unit 02 1.2.mp3, etc. Anyone
>> with a quick script I could use?
>
> If these are the only files in that directory then you could try wget's
> --recursive option to pull everything in the directory down...
>
_______________________________________________
LUG mailing list
[email protected]
http://kym.net/mailman/listinfo/lug
%LUG is generously hosted by INFOCOM http://www.infocom.co.ug/

The above comments and data are owned by whoever posted them (including 
attachments if any). The List's Host is not responsible for them in any way.
---------------------------------------
