[Bug-wget] How to download all the links on a webpage which are in some directory?

2011-08-01 Thread Peng Yu
Suppose I want to download www.xxx.org/somefile/aaa.sfx and the links
therein (but restricted to the directory www.xxx.org/somefile/aaa/).

I tried the option '--mirror -I /somefile/aaa', but it only downloads
www.xxx.org/somefile/aaa.sfx. I'm wondering what the correct option is
to do this.

-- 
Regards,
Peng



Re: [Bug-wget] How to download all the links on a webpage which are in some directory?

2011-08-01 Thread Giuseppe Scrivano
Peng Yu pengyu...@gmail.com writes:

> Suppose I want to download www.xxx.org/somefile/aaa.sfx and the links
> therein (but restricted to the directory www.xxx.org/somefile/aaa/).
>
> I tried the option '--mirror -I /somefile/aaa', but it only downloads
> www.xxx.org/somefile/aaa.sfx. I'm wondering what the correct option is
> to do this.

That looks like the right command.  Can you check with -d (debug
output) what is going wrong?
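
For example, something along these lines (the log file name is just a
placeholder; -o writes wget's messages to a file) should show which
links wget finds in aaa.sfx and why it accepts or rejects them:

  wget -d --mirror -I /somefile/aaa -o wget-debug.log http://www.xxx.org/somefile/aaa.sfx

Then search wget-debug.log for the URLs under /somefile/aaa/.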

Cheers,
Giuseppe