On Feb 22, 4:00pm, Dan Harkless wrote:
>
> > In my application, I wanted to apply the pattern to the *entire* URL,
> > inclusive of intervening directories. I made a small modification to
> > the source code to implement full-URL pattern matching:
> >
> > *** utils.c.orig	Sun Jun 25 23:
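The difference between stock behavior and the full-URL modification can be sketched in shell (the URL and pattern below are made-up examples, not from the original patch):

```shell
# Illustration only: wget's accept/reject patterns are normally matched
# against the file's basename; the proposed change matches the whole URL.
url="ftp://host/pub/images/logo.gif"
pattern="*/images/*"

# Basename-only matching (stock behavior):
base="${url##*/}"
case "$base" in
  $pattern) base_match=yes ;;
  *)        base_match=no ;;
esac

# Full-URL matching (the modification described above):
case "$url" in
  $pattern) full_match=yes ;;
  *)        full_match=no ;;
esac

echo "basename match: $base_match, full-URL match: $full_match"
```

With a pattern that names an intervening directory, only full-URL matching succeeds, which is the point of the change.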
"Daniel Lafraia" <[EMAIL PROTECTED]> writes:
> Hello,
>
> I'm using wget with a simple perl script to browse some FTP directories
> (I don't want to use Net::FTP).
> Actually, I've found that this function (ftp_get_listings) in ftp.c
> always saves the listing file as .listings, which I think
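For inspecting what wget fetched, its real --no-remove-listing option keeps the raw listing file instead of deleting it after parsing. A dry-run sketch (placeholder host, the command is printed rather than executed so no network is needed):

```shell
# --no-remove-listing keeps the .listing file wget's FTP code normally
# parses and then removes. ftp.example.com is a placeholder host.
url="ftp://ftp.example.com/pub/"
cmd="wget --no-remove-listing $url"
echo "$cmd"
# After a real run, the server's raw LIST output is left in ./.listing
```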
Also, does this:
For each directory files must be retrieved from, Wget will use the
@code{LIST} command to get the listing. It will try to analyze the
listing, assuming that it is a Unix @code{ls -l} listing, and extract
the time-stamps. The rest is exactly the same as for @sc{
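The kind of line the LIST analysis expects, and the fields the time-stamp comes from, can be shown with sample data (the listing line below is made up):

```shell
# A typical Unix "ls -l"-style line, as returned by many FTP servers
# in response to LIST (sample data, not from a real server):
line='-rw-r--r--   1 ftp      ftp         14336 Jun 25  2000 wget-1.5.3.tar.gz'
set -- $line             # split on whitespace into positional fields
name=$9
stamp="$6 $7 $8"         # month, day, year-or-time: the time-stamp fields
echo "name=$name stamp=$stamp"
```

Real parsers must also cope with the year-or-time ambiguity in field 8 and with non-Unix listing formats, which is why wget's analysis can fail on unusual servers.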
Jan (or anyone), is this paragraph in 1.7-dev's wget.texi:
You may have to quote the @sc{url} to protect it from being expanded by
your shell. Globbing makes Wget look for a directory listing, which is
system-specific. This is why it currently works only with Unix @sc{ftp}
serv
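Why the quoting matters: unquoted, the shell may expand the wildcard against local files before wget ever sees it. A dry-run sketch (placeholder host, command printed rather than run):

```shell
# Quoting passes the glob pattern through to wget verbatim, so wget
# itself can match it against the server's directory listing.
pattern='ftp://ftp.example.com/pub/wget/*.tar.gz'
cmd="wget \"$pattern\""
echo "$cmd"
```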
Hello!
Does the timeout option -T work OK in wget 1.6?
Best Regards
-Johannes Harju-
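For reference, -T takes the timeout in seconds, so a stalled read is abandoned instead of hanging forever. A dry-run sketch (placeholder URL, command printed rather than run):

```shell
# -T sets wget's read timeout in seconds; 30 and the URL below are
# arbitrary examples.
timeout=30
cmd="wget -T $timeout http://www.example.com/"
echo "$cmd"
```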
I'm thinking wget could have a status option, like bash's
$ set -o
allexport off
braceexpand on
errexit off...
perhaps a plain
$ wget -d
might be a good place for it.
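A purely hypothetical mock-up of what such a "set -o"-style report could look like (wget has no such listing; the option names and values below are invented for illustration):

```shell
# Mock-up only: two columns, option name and on/off state, in the
# style of bash's "set -o" output.
status=$(printf '%-16s %s\n' recursive off timestamping on passive-ftp on)
echo "$status"
```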
--
http://www.geocities.com/jidanni Tel: +886-4-25854780