Hello, Everyone
I am running wget built from SVN, and I have run into a problem that I
have never had before.  I promise that I checked the documentation on
your website to see whether I needed to change how I use wget.  I even
joined this list and perused the archives, with no results.  Not that
they aren't there, but I didn't find any :)
I will use my own domain as an example:
In the past, I would run:
wget -kr -nc http://www.afolkey2.net
and the result would be a mirror of my domain, with the links
converted for local viewing. (In this case, wget is the SVN version,
which is located at /usr/local/bin/wget)

Now, if I run that same command, I get the following output:

[EMAIL PROTECTED] Archives]$ wget -kr -nc http://www.afolkey2.net
--07:55:48--  http://www.afolkey2.net/

Resolving www.afolkey2.net... 12.203.241.111
Connecting to www.afolkey2.net|12.203.241.111|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1339 (1.3K) [text/html]

100%[======================================================================>] 1,339       --.-K/s   in 0.01s

07:55:48 (136 KB/s) - `www.afolkey2.net/index.html' saved [1339/1339]

FINISHED --07:55:48--
Downloaded: 1 files, 1.3K in 0.01s (136 KB/s)
[EMAIL PROTECTED] Archives]$  

As you can see, it downloaded ONLY http://www.afolkey2.net/index.html
and exited without error.

If I try adding a sub-directory to the above example, the result is the
same - wget downloads index.html in the directory that I point it to,
and then exits without error.

But, if I run:
/usr/bin/wget -kr -nc http://www.afolkey2.net (/usr/bin/wget is the
version of wget that ships with the distro that I run, Fedora Core 3)
the result is as it should be: www.afolkey2.net is downloaded in its
entirety, and the links are converted for local viewing.

I understand that maybe something has changed about wget's options,
but I was not able to locate that information on my own.
If this is a bug (Fedora Core 3 specific or not) I would be glad to
report it as soon as you tell me that it is a bug.
If you need any more information, let me know and I will be glad to
oblige.
Right now, I'm going to uninstall /usr/local/bin/wget and reinstall
from a fresh download from SVN and see what happens.....

Have a Great Day,
Steven P. Ulrick

P.S.: For clarification, I only used afolkey2.net as an example.  Every
website that I attempt wget -kr -nc on behaves the same way.  But I
JUST discovered that recursive downloading from FTP sites seems to
work perfectly.  I am now downloading ftp.crosswire.org, and it looks
like it will happily continue until there is nothing more to download.
