Hrvoje Niksic wrote:
Noèl Köthe [EMAIL PROTECTED] writes:
a wget -c problem report with the 1.11 alpha 1 version
(http://bugs.debian.org/378691):
I can reproduce the problem. If I have already downloaded 1 MB, wget -c
doesn't continue. Instead it starts the download again from the beginning:
Mauro, you
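The resume behaviour under discussion can be sketched in a few lines. This is only a conceptual illustration, not wget's actual code: a client resuming a partial download looks at how many bytes it already has and asks the server for the rest via a Range request header. The filename here is invented for the demonstration.

```shell
# Minimal sketch of what -c does conceptually: if a partial file already
# exists, request only the remaining bytes with a Range header.
# part.bin is a made-up stand-in for a partially downloaded file.
printf 'hello' > part.bin                   # pretend 5 bytes were downloaded
offset=$(($(wc -c < part.bin)))             # size of the partial file
printf 'Range: bytes=%s-\n' "$offset"       # header a resuming client sends
```

The bug report above suggests that in 1.11 alpha 1 this resume path was not being taken, so the transfer restarted from byte zero.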
Mauro Tortonesi [EMAIL PROTECTED] writes:
You're right, of course. The attached patch should fix
the problem. Since the new HTTP code supports Content-Disposition
and delays the decision on the destination filename until it
receives the response header, the best solution I could
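The delayed-filename logic described above can be illustrated with a short sketch. The point is that a client honouring Content-Disposition cannot pick the local filename from the URL alone; it has to wait for the response headers, because the server may name the file there. The header value below is invented for the demonstration, and the sed pattern is a simplified parser, not wget's actual one.

```shell
# Hypothetical response header; a client honouring Content-Disposition
# takes the local filename from it instead of deriving it from the URL.
hdr='Content-Disposition: attachment; filename="report.pdf"'
name=$(printf '%s\n' "$hdr" | sed -n 's/.*filename="\([^"]*\)".*/\1/p')
echo "$name"    # report.pdf, not whatever the URL path ended in
```

This is why the fix for -c is less straightforward than in earlier versions: the resume decision now has to happen after the header arrives rather than before the request is sent.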
Thanks for the program, first off. It might be a big help for
me.
What I'm trying to do is pull .aspx pages off of a company's
website as .html files and save them locally. I also need the images and CSS to
be converted for local use as well.
I can't figure out the proper command to do this.
Also, try wget -r -np http://www.example.com/press/release.aspx
and then write a script to change all the .aspx file
extensions to .html (you can't get the server-side code, BTW, only the
HTML that is generated).
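The rename step suggested above could look something like this. The directory and file below are stand-ins created just for the demonstration; in practice you would run the loop over the tree that wget -r produced. (wget's -E/--html-extension option can achieve much the same effect at download time, as the later message in this thread shows.)

```shell
# Sketch of the suggested rename script: give every mirrored .aspx file
# an .html extension. The mirrored tree here is a made-up stand-in.
mkdir -p www.example.com/press
touch www.example.com/press/release.aspx

find www.example.com -name '*.aspx' -type f | while read -r f; do
    mv "$f" "${f%.aspx}.html"     # strip .aspx, append .html
done

ls www.example.com/press          # release.html
```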
Ranjit Sandhu
SRA
From: Savage, Ken [mailto:[EMAIL PROTECTED]
Sent: Thursday, August
Hi folks,
Sorry if this is a FAQ, but I did search around and couldn't find an
easy answer.
I'm using the Windows version 1.10-2b.
When using the command line:
wget --mirror --convert-links --html-extension site
I get all of the files and the directory structure, but when I try to browse
the files