David Greaves wrote:
Hi
If I specify -s and -c, then the resulting file is corrupted whenever a
resume occurs, because the resume writes the saved headers partway
through the file.
Additionally, the resume doesn't grab the whole file, because it
miscounts the size by ignoring the header bytes.
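For example (hypothetical URL; any file large enough to interrupt will
show it):

  wget -s -c http://example.com/big.iso   (interrupt this partway through)
  wget -s -c http://example.com/big.iso   (on resume, a second set of HTTP
                                           headers lands at the resume
                                           offset, and the size arithmetic
                                           doesn't allow for the header
                                           bytes, so the grab ends short)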
Is this on anyone's
Then you haven't looked at enough web sites. Whenever tidydbg (from w3.org)
tells me to do that in one of my URLs, I do it. I've got one page of links
that has tons of them, and they work. Can we stop arguing about this
off-topic bit now?
Mark Post
-----Original Message-----
From: Tony Lewis [
Hi,
I am working on a Windows XP machine.
I tried to use Wget 1.9.1, but I am unable to download any file.
I am behind a proxy server and firewall. Is the proxy restricting me, or is
there some other problem?
The commands I used are
1. wget yahoo.com
2. wget --proxy=on http_proxy=.:80/ http://rediff.com
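If it matters, the form I was trying to follow (with proxy.example.com:80
standing in for our actual proxy) sets the proxy in the environment first:

  set http_proxy=http://proxy.example.com:80/
  wget --proxy=on http://rediff.com/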
There ar