> wget does not download all directories at a URL unless I tack on a
> sub-directory.

   Wget and I do not know what "all directories at a URL" means.  And,
between my weak psychic powers and a description like "[url]", I can't
tell if Wget is working properly or not.

   A URL tells Wget which protocol to use, which computer to ask, and
what to ask for.  Exactly what Wget does depends on the protocol (or
"scheme", like, say, FTP or HTTP) which the URL specifies.  When using
FTP, with "-r", Wget may get a directory listing, and then it can fetch
files recursively.  When using HTTP, Wget fetches a Web page, and, with
"-r", it follows links in the Web page which the Web server returns.  If
that Web page does not include links to the files of interest, then Wget
probably won't find/fetch them.
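
   To illustrate the difference (the host "ftp.example.com" and paths
here are placeholders, not your actual URL):

```shell
# FTP: Wget can request a directory listing from the server,
# and then recurse through the entries in that listing.
wget -r ftp://ftp.example.com/pub/somedir/

# HTTP: Wget fetches the page at the URL, then follows only the
# links which appear in the HTML that the server sends back.
wget -r http://www.example.com/somedir/
```

If the HTTP server returns a page with no links to the files (or no
auto-generated index at all), then "-r" has nothing to follow.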

   If you want a detailed description of what Wget is doing, add "-d" to
your command.  If you want a detailed explanation of why Wget does not
work as expected with "[url]", then you may need to provide some basic
information about "[url]".
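
   For example (again, substitute your actual URL for the placeholder),
redirecting the debug output to a file makes it easier to read or to
post to the list:

```shell
# "-d" emits debug details: headers sent/received, links found, decisions made.
wget -d -r http://www.example.com/somedir/ 2> wget-debug.log
```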

------------------------------------------------------------------------

   Steven M. Schweda               sms@antinode-info
   382 South Warwick Street        (+1) 651-699-9818
   Saint Paul  MN  55105-2547
