Well, since you are using -np it shouldn't be very difficult to also add
-l0.
Or, look at the structure of the pages and try to understand why wget
stops downloading.
Or, run wget with -d -a wget.log and check the logfile afterwards to see
EXACTLY why wget stopped, but you'll have to wade through a lot of
output.
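For reference, the debug run suggested above would look something like the
following, combined with the mirroring flags from the original command
(this is a sketch, assuming GNU Wget; debug output is verbose and its
exact wording varies between wget versions):

```shell
# -d           turn on debug output (why each URL was followed or skipped)
# -a wget.log  append all output to wget.log instead of the terminal
# -m -A pdf -np  the mirror/accept/no-parent flags from the original command
wget -d -a wget.log -m -A pdf -np http://faculty.haas.berkeley.edu/

# Then read wget.log to find where recursion stopped, e.g.:
less wget.log
```

The interesting lines in the log are the ones where wget decides whether
to enqueue or reject each link it finds.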
Heiko

-- 
-- PREVINET S.p.A. www.previnet.it
-- Heiko Herold [EMAIL PROTECTED]
-- +39-041-5907073 ph
-- +39-041-5907472 fax

> -----Original Message-----
> From: Kyle Lundstedt [mailto:[EMAIL PROTECTED]
> Sent: Monday, June 16, 2003 2:11 AM
> To: [EMAIL PROTECTED]
> Subject: Downloading from a Site, not a Subdirectory
> 
> 
> Hi,
>     I'm trying to mirror a site which includes a large number of PDF
> files.  If I use the following command:
>  
> wget -m -A pdf -np http://faculty.haas.berkeley.edu/wallace/
>  
> I'm able to obtain all of the PDF files in the "wallace" directory and
> its subdirectories.
> However, I'm interested in obtaining all of the PDF files on the
> faculty.haas.berkeley.edu site.
> When I use the following command:
>  
> wget -m -A pdf -np http://faculty.haas.berkeley.edu/
>  
> I don't get any PDF files at all.  Anyone know how I can make 
> this work?
>  
> Thanks,
> Kyle
> 
