I know that to download a web page that is generated by a CGI
script (or PHP, ASP, JSP, etc.), you should
use the --output-document parameter. For example, if
you want to download the following address:
http://www.osnews.com/comment.php?news_id=5602&offset=105&rows=120
you would simply do something like this.
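A minimal sketch of that invocation (the output filename page.html is my own choice, and the URL must be quoted so the shell does not treat the & characters as background operators):

```shell
# Save the dynamically generated page under a filename of our choosing;
# the single quotes keep the shell from interpreting & and ? in the URL.
wget --output-document=page.html \
  'http://www.osnews.com/comment.php?news_id=5602&offset=105&rows=120'
```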
Is there any way to download the structure (which
files and directories exist, file sizes, etc.) of an FTP
site without downloading the actual files?
Preferably, I would like it to recursively descend
through all the directories so that I can select exactly
which files I want to download. Thanks
Noname NoLast [EMAIL PROTECTED] writes:
reasonable so that it can save them) otherwise it will
give me error messages such as:
Cannot write to
'http://www.osnews.com/comment.php?news_id=5602&offset=105&rows=120'
I don't understand this error message. Wget should never try to write
to a
Noname NoLast [EMAIL PROTECTED] writes:
Is there any way to download the structure (which files, directories,
file sizes, etc.) of an FTP site without downloading the actual
files?
I think `-R *' should work.
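A sketch of that suggestion, with two assumptions beyond the original hint: the host ftp.example.org is hypothetical, and --no-remove-listing (which keeps the raw .listing files wget fetches while traversing) is my own addition:

```shell
# Recurse through the FTP tree but reject every file by pattern,
# leaving only the directory structure and listings behind.
# Quote the pattern so the shell does not expand it before wget sees it.
wget -r -R '*' --no-remove-listing ftp://ftp.example.org/pub/
```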
H For getting Wget you might want to link directly to
H ftp://ftp.sunsite.dk/projects/wget/windows/wget-1.9.1b-complete.zip,
OK, but it's too bad there's no stable second link .../latest.zip, so I
wouldn't have to update my web page to follow the link.
Furthermore, they don't need SSL, but I don't see any