`--no-clobber' is a very useful option, but I retrieve documents not only with
the .html/.htm suffix.
Please add an option that, like -A/-R, defines accept/reject rules
for the -nc option.
To subscribe to this list, please send mail to
[EMAIL PROTECTED].
There is currently no way to disable following redirects. A patch to
do so has been submitted recently, but I didn't see a good reason why
one would need it, so I didn't add the option. Your mail is a good
argument, but I don't know how prevalent that behavior is.
What is it with servers that
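The redirect behaviour discussed above can be sketched outside of wget. This is not wget code (as the message says, wget has no switch for this yet); it is a minimal illustration of "refuse to follow redirects" using Python's standard urllib, where returning None from a redirect handler turns a 3xx response into an error:

```python
import urllib.request

# Sketch only: the idea of disabling redirect-following, shown with
# Python's urllib rather than wget itself.
class NoRedirectHandler(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        # Returning None tells urllib to raise HTTPError on a redirect
        # instead of silently fetching the new URL.
        return None

opener = urllib.request.build_opener(NoRedirectHandler())
# opener.open(url) would now fail with HTTPError on any 3xx response.
```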
Hi Sergey!
-nc does not apply only to .htm(l) files.
All files are considered.
At least in all wget versions I know of.
I cannot comment on your suggestion to restrict -nc to a
user-specified list of file types.
I personally don't need it, but I could imagine certain situations
where this
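The restriction being proposed, -A/-R-style suffix rules applied to the no-clobber decision, can be sketched as follows. The function name and its use for the clobber decision are hypothetical (this is the feature request, not existing wget behavior); only the -A/-R suffix-matching idea comes from wget:

```python
# Hypothetical sketch of accept/reject suffix rules for -nc,
# modeled on wget's -A (accept) / -R (reject) suffix lists.
def clobber_allowed(filename, accept=None, reject=None):
    """Return True if an existing `filename` may be skipped (-nc style)."""
    if reject and any(filename.endswith(suf) for suf in reject):
        return False
    if accept:
        return any(filename.endswith(suf) for suf in accept)
    return True
```

With accept=[".html", ".htm"], only HTML files would be subject to the no-clobber skip, which is what the original poster seems to want.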
From: Gisle Vanem [mailto:[EMAIL PROTECTED]]
Jens Rösner [EMAIL PROTECTED] said:
...
I assume Heiko didn't notice it because he doesn't have that function
in his kernel32.dll. Heiko and Hrvoje, will you correct this ASAP?
--gv
Probably.
Currently I'm compiling and testing on NT 4.0
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]]
This might be one cause for compilation breakage in html-parse.c.
It's a GCC-ism/C99-ism/C++-ism, depending on how you look at it, fixed
by this patch:
2003-10-03 Hrvoje Niksic [EMAIL PROTECTED]
* html-parse.c (convert_and_copy):
Hello Everyone,
I am new to this wget utility, so pardon my ignorance. Here is a brief explanation of
what I am currently doing:
1). I go to our customer's website every day and log in using a User Name and Password.
2). I click on 3 links before I get to the page I want.
3). I right-click on the
I'm a big fan of wget. I've been using it for quite a while now, and am now
testing the 1.9beta3 on win2k.
First of all, I'd like to suggest a couple of things:
# it should be possible to tell wget to ignore a couple of errors:
FTPLOGINC // FTPs often give out this error when they're full. I
Suhas Tembe wrote:
1). I go to our customer's website every day and log in using a User Name
and Password.
[snip]
4). I save the source to a file and subsequently perform various tasks on
that file.
What I would like to do is automate this process of obtaining the source
of a page using wget. Is
Tony Lewis [EMAIL PROTECTED] writes:
wget
http://www.custsite.com/some/page.html --http-user=USER --http-passwd=PASS
If you supply your user ID and password via a web form, it will be
tricky (if not impossible) because wget doesn't POST forms (unless
someone added that option while I wasn't
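The --http-user/--http-passwd pair suggested above typically boils down to HTTP Basic authentication: an Authorization header carrying base64("user:password"). A minimal sketch of that header (USER and PASS are placeholders, not real credentials, and the helper name is made up):

```python
import base64

# Sketch: the header that HTTP Basic auth puts on the wire.
# USER/PASS below are placeholder credentials for illustration.
def basic_auth_header(user, password):
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Authorization: Basic {token}"
```

This only covers the case where the site uses HTTP authentication; as noted above, a login implemented as a web form is a different mechanism entirely.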
Suhas Tembe [EMAIL PROTECTED] writes:
Hello Everyone,
I am new to this wget utility, so pardon my ignorance.. Here is a
brief explanation of what I am currently doing:
1). I go to our customer's website every day and log in using a User Name and Password.
2). I click on 3 links before I get to
Several bugs fixed since beta3, including a fatal one on Windows.
Includes a working Windows implementation of run_with_timeout.
Get it from:
http://fly.srk.fer.hr/~hniksic/wget/wget-1.9-beta4.tar.gz
-q and -S are incompatible; perhaps they should produce an error, and this
should be noted in the docs.
BTW, there seems to be no way to get the -S output without a progress
indicator. -nv and -q kill them both.
P.S. One shouldn't have to confirm each bug submission. Once should be enough.