Mark Street [EMAIL PROTECTED] writes:
Many thanks for the explanation and the patch. Yes, this patch
successfully resolves the problem for my particular test case.
Thanks for testing it. It has been applied to the code and will be in
Wget 1.10.1 and later.
Thanks to the effort of Mauro Tortonesi and the prior work of Bruno
Haible, Wget has been modified to no longer use Libtool for linking in
external libraries. If you are interested in why that might be a
cause for celebration, read on.
A bit of history: Libtool was integrated in Wget by Dan
Matthew J Harms [EMAIL PROTECTED] writes:
I'm sure you've already had this suggested, and I don't know if it
will work, due to the complexity of the suggestion, but is there a
way you could implement the capability for wget to download any file
that meets a criterion yet uses wildcards (i.e. *
Karsten Hopp [EMAIL PROTECTED] writes:
svn checkout http://svn.dotsrc.org/repo/wget/branches/1.10/ wget-stable
minor issues with that:
[wget-stable] ./autogen.sh
[wget-stable] ./configure --prefix=/usr/
configure: configuring for GNU Wget 1.10.1-beta
Since Wget 1.10 also prints sizes in kilobytes/megabytes/etc., I am
thinking of removing the thousand separators from size display. The
reasons are:
* The separators need to be manually removed when the numbers are
pasted into any software that deals with numbers, such as bc.
This problem
strpbrk is a BSD 4.3 [1] function apparently also mandated by POSIX,
C99, and present on Windows and VMS. Is there a system we care about
that doesn't have it?
[EMAIL PROTECTED] (Larry Jones) writes:
Hrvoje Niksic writes:
strpbrk is a BSD 4.3 [1] function apparently also mandated by POSIX,
C99, and present on Windows and VMS. Is there a system we care about
that doesn't have it?
It was also mandated by C89, so probably not.
Thanks
Oliver Schulze L. [EMAIL PROTECTED] writes:
Looks really nice. Maybe it needs a link to instructions on how to
subscribe to the mailing list.
You can always add it. :-)
But we already have a link to the home page where the information
resides. Links to subscription details probably don't
Does anyone feel that the ChangeLog-branches directories distributed
with Wget are desirable or necessary? These side ChangeLogs are
accumulating and they *repeat* ChangeLog text many times over!
For example, the 1.8, 1.9, and 1.10 branch changelogs have a lot of
overlapping contents. The
Hrvoje Niksic [EMAIL PROTECTED] writes:
Thanks. Is there a copy of C89 (or a close draft) online? It would
be very useful for checking such things.
Oops: typing `c89 draft' into google produces this as the first link:
http://dev.unicals.com/papers/c89-draft.html
During the last couple of weeks I spent some time improving
Wikipedia's page on Wget, ending up with a complete rewrite of the
original, very terse, page. Please let me know how you like it and if
you think it needs corrections or additions.
http://en.wikipedia.org/wiki/Wget
Does anyone use the Watcom compiler or its open-source offspring?
There are some ugly Watcom-specific ifdefs in Wget that we'd be better
off without -- unless someone is actually using it.
Ariel [EMAIL PROTECTED] writes:
Was looking for an option to skip existing files, and after some time
(minutes? hours?) of no luck, I looked at that option: -nc, "Don't
clobber existing files."
clobber == overwrite
http://www.science.uva.nl/~mes/jargon/c/clobber.html
That term has been used so
Mauro Tortonesi [EMAIL PROTECTED] writes:
The new repository is accessible at:
http://svn.dotsrc.org/repo/wget/
For the uninitiated, to checkout the repository, you need a reasonably
recent version of the subversion client and issue something like:
svn checkout
Hrvoje Niksic [EMAIL PROTECTED] writes:
If you want to check out the 1.10 branch (recommended for
distributions because it only contains bug fixes), you can use:
svn checkout http://svn.dotsrc.org/repo/wget/trunk/ wget
Oops! The above should read something like:
svn checkout http
Will Kuhn [EMAIL PROTECTED] writes:
Apparently wget does not handle single quotes or double quotes very
well. wget with the following arguments gives an error.
wget
--user-agent='Mozilla/5.0' --cookies=off --header
'Cookie: testbounce=testing;
Benno Schulenberg [EMAIL PROTECTED] writes:
Ah, but the actual change is in the quoting, like it is used in the
other "Invalid boolean" message; otherwise "use always" might be
understood as "always use".
You're right. One reason I was wary of adding the quotes was that the
message is already
Gabor Z. Papp [EMAIL PROTECTED] writes:
* Hrvoje Niksic [EMAIL PROTECTED]:
| new configure script coming with wget 1.10 does not honour
| --with-ssl=/path/to/ssl because at linking conftest only
| -I/path/to/ssl/include used, and no -L/path/to/ssl/lib
|
| That is not supposed to happen
Jens Schleusener [EMAIL PROTECTED] writes:
--12:36:51-- http://www.example.com/
=> `index.html'
Resolving www.example.com... failed: Invalid flags in hints.
This is really bad. Apparently your version of getaddrinfo is broken
or Wget is using it incorrectly. Can you intuit
Gabor Z. Papp [EMAIL PROTECTED] writes:
* Hrvoje Niksic [EMAIL PROTECTED]:
| According to config.log, it seems your SSL includes are not in
| /pkg/include after all:
Sure, they are in /pkg/include/openssl.
You're right. The Autoconf-generated test is wrong, and I'm trying to
figure out
Jens Schleusener [EMAIL PROTECTED] writes:
The reason for the above error is as already written - at least in
my case using the self compiled libtool version 1.5
I don't think the libtool version used on the system makes any
difference (except for a developer at the point of libtoolizing his
Mauro Tortonesi [EMAIL PROTECTED] writes:
this seems to be already fixed in the 1.10 documentation.
Now that 1.10 is released, we should probably update the on-site
documentation.
Thanks for the detailed report!
Jens Schleusener [EMAIL PROTECTED] writes:
1) Only using the configure-option --disable-nls and the C compiler
gcc 4.0.0 the wget-binary builds successfully
I'd be interested in seeing the error log without --disable-nls and/or
with the system compiler.
This patch should take care of the problems with compiling Wget 1.10
with the native IBM cc.
2005-06-15 Hrvoje Niksic [EMAIL PROTECTED]
* host.h (ip_address): Remove the trailing comma from the type
enum in the no-IPv6 case.
* main.c (struct cmdline_option): Remove
for the report; this patch should fix the problem:
2005-06-15 Hrvoje Niksic [EMAIL PROTECTED]
* ftp-basic.c (ftp_pwd): Handle malformed PWD response.
Index: src/ftp-basic.c
===
RCS file: /pack/anoncvs/wget/src/ftp-basic.c,v
Benno Schulenberg [EMAIL PROTECTED] writes:
A few messages in wget-1.10-rc1 seem to have been overlooked during
gettextization. The patch repairs this.
Thanks a lot for catching these! I'm about to apply them (with the
noted exception below) to the CVS.
It's maybe a bit late to make such a
Herb Schilling hschilling at nasa.gov writes:
When I set the restrict-file-names mode to windows, the filenames and
the links to these files look like this:
photoalbum_photo_view at b_start%3Aint=0
That's a bug, the link should have %25 in place of %. This is fixed
in Wget 1.10,
I am trying to use wget 1.9.1 to download a file using https. The initial
request URL is a script that redirects to another file in the same domain.
If I try this using normal http, it works - the first request returns a
302 response, and wget follows the new URL to download the file. If I try
James Gregory [EMAIL PROTECTED] writes:
I.e., [--header] is truncating some of the header arguments after
commas, and ignoring other header arguments altogether.
Thanks for reporting this. Here is a patch that fixes the problem:
2005-05-30 Hrvoje Niksic [EMAIL PROTECTED]
* init.c
Werner LEMBERG [EMAIL PROTECTED] writes:
directly from the build directory, without using a .wgetrc file. In
the file `screenshots.html' there is a reference to the file
../image/ft2-kde-thumb.png
The reference looks like this:
<image width=160 height=120 alt="KDE screenshot">
Dan Bolser [EMAIL PROTECTED] writes:
I think --timestamping fails for files over 2 GB
Thanks for the report. Wget 1.9.x doesn't support 2+GB files, not
only for timestamping. You can try Wget 1.10-beta from
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-beta1.tar.bz2
Maxim Chervyak [EMAIL PROTECTED] writes:
I tried to download a file from an FTP server with wget, but I
could not. That FTP server doesn't allow listing directories; the
FTP command CWD doesn't work on it, and this causes an error. How
can I turn off this command?
I'm not sure how Wget is
[EMAIL PROTECTED] writes:
After receiving a large file of about 3GB I received this abort
message.
Wget 1.9.x doesn't support 2+GB files. You can try Wget 1.10-beta
from
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-beta1.tar.bz2 .
Andrew Gargan [EMAIL PROTECTED] writes:
wget ftp://someuser:[EMAIL PROTECTED]@www.somedomain.com/some_file.tgz
is splitting on the first @, not the second.
Encode the '@' as %40 and this will work. For example:
wget ftp://someuser:[EMAIL PROTECTED]/some_file.tgz
Is this a problem
Does anyone know if it is possible, and how, to control the way
OpenSSL communicates with the remote host? Wget normally precedes
each read() and write() with a select() that enforces the idle timeout
specified by the user using --read-timeout and --timeout.
In SSL it is not enough to select()
Daniel Stenberg [EMAIL PROTECTED] writes:
On Wed, 25 May 2005, Hrvoje Niksic wrote:
In SSL it is not enough to select() before SSL_read because
SSL_read can and does call read multiple times, which breaks the
intended timeout semantics. Is there a way to prevent this?
I figure one way
Mark Anderson [EMAIL PROTECTED] writes:
Is there an option, or could you add one if there isn't, to specify
that I want wget to write the downloaded html file, or whatever, to
stdout so I can pipe it into some filters in a script?
Yes, use `-O -'.
[EMAIL PROTECTED] writes:
I use wget 1.9.1. In IE 6.0 the page loads OK, but wget returns an
error (is it a bug, a timeout, or ...?)
Thanks for the report. The reported timeout might or might not be
incorrect. Wget 1.9.1 on Windows has a known bug of misrepresenting
error codes (this has been fixed in
Alain Guibert [EMAIL PROTECTED] writes:
On Saturday, May 14, 2005 at 10:09:32 PM +0200, Hrvoje Niksic wrote:
Alain Guibert [EMAIL PROTECTED] writes:
Maybe cmpt.c mktime is failing because of incompatible timezone and
daylight infos on the platform?
If you change __tzset() to tzset
Jim Peterson [EMAIL PROTECTED] writes:
Using Fedora Core 3, when I run `wget http://www.studylight.org/', it prints
out:
--02:52:30-- http://www.studylight.org/
=> `index.html'
Resolving www.studylight.org... 63.164.18.58
Connecting to www.studylight.org[63.164.18.58]:80...
Seemant Kulleen [EMAIL PROTECTED] writes:
I wanted to alert you all to a bug in wget, reported by one of our
(gentoo) users at:
https://bugs.gentoo.org/show_bug.cgi?id=69827
I am the maintainer for the Gentoo ebuild for wget.
If someone would be willing to look at and help us with that
Seemant Kulleen [EMAIL PROTECTED] writes:
Since I don't use Gentoo, I'll need more details to fix this.
For one, I haven't tried Wget with socks for a while now. Older
versions of Wget supported a --with-socks option, but the procedure
for linking a program with socks has changed since then,
Post, Mark K [EMAIL PROTECTED] writes:
I really don't know, but they seem very accommodating to people,
especially Open Source projects such as wget. It's certainly worth
an email to find out. Send your request to help at ibiblio.org.
Unfortunately they told me that they don't host
Alain Guibert [EMAIL PROTECTED] writes:
I find it strange that the cmpt.c mktime produces garbage when that
version is taken from glibc.
Note that with TZ=GMT0 method and #undef HAVE_MKTIME, wget gives
correct timestamps. Maybe cmpt.c mktime is failing because of
incompatible timezone and
Vladimir Bilyov [EMAIL PROTECTED] writes:
Unfortunately LIST on an FTP site doesn't work; wget fails with "No
such directory" :-( Is there a way to make wget not do LIST?
Wget shouldn't use LIST for downloading a single file, unless you also
specify -N.
Mauro and I are considering the move from CVS to subversion for Wget's
version control. Although switching to subversion is not entirely
uncontroversial, it has advantages that make it great food for
thought.
CVS's network usage is appalling. I have a fairly slow upload link on
my ADSL (the
Doug Kaufman [EMAIL PROTECTED] writes:
That sounds like a good plan. I'll try to make such a change. If
we do call SSL_CTX_set_default_paths, should we document SSL_CERT_*
env variables as you originally suggested?
I think so. I did send a message to the openssl-dev list about this.
Let's
Daniel Stenberg [EMAIL PROTECTED] writes:
There are no license restrictions that prevent you from
using/bundling/including the Mozilla one (if you want to). I have a
little service up and running for those who want the latest Mozilla
ca cert bundle in PEM format:
Post, Mark K [EMAIL PROTECTED] writes:
You might want to give Ibiblio a try (www.ibiblio.org). They host
my Slack/390 web/FTP site at no cost. They host a _bunch_ of sites
at no cost.
But do they host subversion? I can't find any mention of it with
google.
Joerg Ottermann [EMAIL PROTECTED] writes:
I try to archive some pages using wget, but it seems that I have some
problems when TE: chunked is used.
The server must not use Transfer-Encoding: chunked in response to an
HTTP/1.0 request. Are you sure that is the problem?
Claudio Fontana [EMAIL PROTECTED] writes:
here's my segfault. It requires some preparation.
[...]
Thanks for the detailed report and the stack trace. This patch should
fix the segmentation fault.
2005-05-10 Hrvoje Niksic [EMAIL PROTECTED]
* res.c (res_register_specs): Correctly
Hrvoje Niksic [EMAIL PROTECTED] writes:
Specifically I am interested in the correctness of the code that
loads the client certificates and checks for server certificates.
Here is the thing we definitely miss: Wget doesn't contain code that
checks the host identity presented by the server's
Daniel Stenberg [EMAIL PROTECTED] writes:
It does require a replica, exact or not.
It's interesting that none of the OpenSSL examples include such code.
In fact, curl may be the single free application that attempts to get
this right!
If you verify a server certificate, you must make sure the
Werner Schmitt [EMAIL PROTECTED] writes:
on machine 2 with wget version: Wget 1.9.1
i get error: not implemented !!
on machine 1 with wget Version: Wget 1.9+cvs-dev
everything is ok
That is because the other machine has a newer (CVS) version of Wget
that correctly implements HTTPS
Alain Guibert [EMAIL PROTECTED] writes:
On Saturday, May 7, 2005 at 6:56:53 PM +0200, Hrvoje Niksic wrote:
I was led to believe that it is quite reasonable to set tm_isdst to
zero before calling mktime. In fact, the logic in mktime_from_utc,
along with setting tm_isdst to zero
I invite people who are knowledgeable about OpenSSL programming to
audit Wget's SSL-related code, located in `openssl.c'.
Specifically I am interested in the correctness of the code that loads
the client certificates and checks for server certificates. While the
basic HTTPS downloads work (and
Jacek Gbal [EMAIL PROTECTED] writes:
I'm trying to get vcf files from a site, and it seems that wildcards
don't work properly with the -A option.
Unfortunately the -A option matches only the file names (the last
directory component of the URL path), not the query string.
Alain Guibert [EMAIL PROTECTED] writes:
No such problem with Woody and Glibc. This problem seems to be half
Wget, half libc 5.4.33. In src/http.c:mktime_from_utc() and
http_atotm(), Wget manipulates tm structs, forcing tm_isdst to
0, which is false seven months a year and hurts libc mktime()
Alain Guibert [EMAIL PROTECTED] writes:
I can now confirm: Alpha3+configure.in patch builds cleanly on Debian Bo
even without --disable-ipv6:
Excellent. Thanks for testing this.
Will Kuhn [EMAIL PROTECTED] writes:
I try to do something like
wget http://website.com/ ...
login=usernamedomain=hotmail%2ecom_lang=EN
But when wget sends the URL out, the hotmail%2ecom
becomes hotmail.com !!! Is this the supposed
behaviour ?
Yes.
I saw this on the sniffer. I suppose
Hrvoje Niksic [EMAIL PROTECTED] writes:
Can I have it not do the translation ??!
Unfortunately, only by changing the source code as described in the
previous mail.
BTW I've just changed the CVS code to not decode the % sequences.
Wget 1.10 will contain the fix.
sure to include the
appropriate patch.
2005-05-07 Hrvoje Niksic [EMAIL PROTECTED]
* ftp-basic.c (ftp_request): Prevent newlines in VALUE causing
inadvertent sending of multiple FTP commands.
Index: src/ftp-basic.c
Hrvoje Niksic [EMAIL PROTECTED] writes:
A fix that applies to 1.9.1 follows in a separate mail.
Distributors of Wget will probably want to make sure to include the
appropriate patch.
Here is that fix.
2005-05-07 Hrvoje Niksic [EMAIL PROTECTED]
* ftp-basic.c (ftp_request): Prevent
Are there any systems in use today that don't support gettimeofday?
Wget tests for its availability because, at the time I wrote that code,
I had seen other programs perform the same test. However, I don't
think I've ever seen a system that didn't support gettimeofday, with
the exception of Windows.
Vitaly Lomov [EMAIL PROTECTED] writes:
Hello
I am trying to get a site http://www.cro.ie/index.asp with the following flags
-r -l2
or
-kr -l2
or
-Er -l2
or
-Ekr -l2
In all cases, the linked files are saved with '@' instead of '?' in
the name, but in the index.asp the link still refers
Vitaly Lomov [EMAIL PROTECTED] writes:
Maybe you're not letting Wget finish the mirroring. The links are
converted only after everything has been downloaded. I've now tried
`wget -Ekrl2 http://www.cro.ie/index.asp --restrict-file-names=windows'
(the last argument being to emulate
Andrzej [EMAIL PROTECTED] writes:
Will the patches be included in the stable 1.10?
Probably. 1.10 is in feature freeze, but this really is a bug fix.
I'd like to check with others if that change is deemed safe for
mirroring of other sites.
Andrzej [EMAIL PROTECTED] writes:
Clicking on that link redirects to that page:
https://lists.man.lodz.pl/mailman/listinfo
and from all the links which are on that page the files are unnecessarily
downloaded (I do not want that page and the subpages).
So how can I block it?
Could
Andrzej [EMAIL PROTECTED] writes:
It's not the end of troubles though!
It works correctly *only* for the first time!
When I (or cron) run the same mirroring commands again over already
mirrored files to renew the mirror, then the correctly converted link of
the gif file (on the main
Andrzej [EMAIL PROTECTED] writes:
[Wget] creates a non existing link:
http://znik.wbc.lublin.pl/Mineraly/Ftp/UpLoad/index.html
[...]
it also created (when the above wget command was run for the first
time) from the original link to the gif file:
Andrzej [EMAIL PROTECTED] writes:
Yes, it works for me as well when I have already mirrored it with
the 1.9.1 version. Only then did the two problems I described before
disappear. So it was the fault of the old 1.8.1 version.
Many bugs have been fixed from 1.8.1 to 1.9.1. It is always a good
idea to try
Jens Rösner [EMAIL PROTECTED] writes:
Well, if wget has to put index.html in such situations then wget is not
suitable for mirroring such sites,
What exactly do you mean? It seems to work for me, e.g. index.html looks
like the apache-generated directory listing. When mirroring, index.html
Alain Guibert [EMAIL PROTECTED] writes:
Alpha3+ptimer.c patch builds OK, and the binary works. Much thanks
Hrvoje!
[...]
Alpha3 builds OK with --disable-ipv6, and the binary seems to work.
Thanks for testing this.
On Friday, April 29, 2005 at 2:15:55 PM +0200, Hrvoje Niksic wrote
Alan Thomas [EMAIL PROTECTED] writes:
Can I somehow give wget an HTML file's local hard disk location vice
a URL and have it retrieve files at URLs referenced in that HTML
file?
If I understand you correctly, it would be:
wget --force-html -i file
is not constant
| ptimer.c:143: (near initialization for `clocks[0].id')
| make[1]: *** [ptimer.o] Error 1
| make[1]: Leaving directory `/tmp/wget-1.10-alpha3/src'
| make: *** [src] Error 2
Does this patch fix the problem?
2005-04-29 Hrvoje Niksic [EMAIL PROTECTED]
* ptimer.c (posix_init
Hrvoje Niksic [EMAIL PROTECTED] writes:
| checking for getaddrinfo... no
| configure: Disabling IPv6 support: your system does not support
getaddrinfo(3)
| checking for INET6 protocol support... yes
| checking for struct sockaddr_in6... yes
| checking for struct sockaddr_storage
Herold Heiko [EMAIL PROTECTED] writes:
windows/wget.dep needs an attached patch (change gen_sslfunc to openssl.c,
change gen_sslfunc.h to ssl.h).
Applied, thanks.
src/Makefile.in doesn't contain dependencies for http-ntlm$o
(windows/wget.dep either).
I don't have the dependency-generating
RFC 2817 seems to imply that CONNECT requests should include a `Host'
header, presumably with contents pretty much identical to the argument
of the CONNECT method.
The original CONNECT proposal by Luotonen didn't mention `Host' at
all. curl doesn't send it, while Mozilla does. I haven't checked
Thanks for the report; this problem is fixed in CVS. The workaround
is to wrap the appropriate init.c line in #ifdef HAVE_SSL.
Joachim Fahnenmueller [EMAIL PROTECTED] writes:
The page contains many links with PHP targets similar to above.
Wget downloaded all the linked files, pictures etc correctly, but
then I had two problems:
1. Some local links don't work. E. g. one of the downloaded pages is saved as
This problem has been fixed for the upcoming 1.10 release. If you
want to try it, it's available at
ftp://ftp.deepspace6.net/pub/ds6/sources/wget/wget-1.10-alpha2.tar.bz2
[EMAIL PROTECTED] writes:
Is there a publicly accessible site that exhibits this problem?
I've set up a small example which illustrates the problem. Files can
be found at http://dev.mesca.net/wget/ (using demo:test as login).
Thanks for setting up this test case. It has uncovered at least
Arndt Humpert [EMAIL PROTECTED] writes:
wget, win32 rel. crashes with huge files.
Thanks for the report. This problem has been fixed in the latest
version, available at http://xoomer.virgilio.it/hherold/ .
Andrzej [EMAIL PROTECTED] writes:
I mirrored the chemfan site using those options:
wget -m -nv -k -K -E -nH --cut-dirs=1 -np -t 1000 -D wbc.lublin.pl -o
$HOME/logiwget/logchemfan.pl -P $HOME/web/chemfan.pl -p
http://znik.wbc.lublin.pl/ChemFan/
and unfortunately the links are not
Andrzej [EMAIL PROTECTED] writes:
Thus it seems that it should not matter what the sequence of the
options is. If it does, I suggest that the developers of wget place
appropriate info in the manual.
Yes, you're right. Anyway, I have often found that it's quite tricky
setting up
Andrzej [EMAIL PROTECTED] writes:
The multitude of options in Wget is just an illusion. In real life
Wget cannot cope with mirroring sites.
I agree with your criticism, if not with your tone. We are working on
improving Wget, and I believe that the problems you have seen will be
fixed in the
[ Moving discussion to wget@sunsite.dk ]
Doug Kaufman [EMAIL PROTECTED] writes:
I just grep'd again through the openssl distribution, and there is
no mention of the environment variables in any of the documentation,
just in the code itself.
If they are completely undocumented, is it wise to
Andrzej [EMAIL PROTECTED] writes:
Using the -p option should guarantee downloading of all the graphics etc:
wget -m -nv -k -K -E -nH -p -np -t 1000 -D andyk.feedle.com -o
$HOME/logiwget/logandyk -P /www/andyk http://andyk.feedle.com/
but it doesn't work here. Why?
Wget version installed
Andrzej [EMAIL PROTECTED] writes:
I want to mirror the website:
http://web.pertus.com.pl/~andyk/
into the directory
/www/wyciszanie.pc
I use it like that:
wget -m -nv -k -K -E -nH -p -np -t 1000 -D web.pertus.com.pl -o
$HOME/logiwget/logwyciszanie -P /www/wyciszanie.pc
first
--prefer-family=ipv6 respect order returned by getaddrinfo
--- doc/ChangeLog:
2005-04-24 Hrvoje Niksic [EMAIL PROTECTED]
* wget.texi (Download Options): Document --prefer-family.
--- src/ChangeLog
2005-04-24 Hrvoje Niksic [EMAIL PROTECTED]
* host.c
Andrzej [EMAIL PROTECTED] writes:
Because a ~andyk directory is in the URL. If you don't want it, use
either -nd or --cut-dirs=1, depending on whether you want to get rid
of the whole directory hierarchy or just that one dir.
--cut-dirs=1 solves the problem. Thanks.
BTW, why are there
Andrzej [EMAIL PROTECTED] writes:
-p should probably go to other sites as well by default, but it
doesn't do so yet.
Well, that's what I wanted to use this option for!
Sorry about that.
The manual regarding the -p option says: Note that Wget will
behave as if -r had been specified, but
Doug Kaufman [EMAIL PROTECTED] writes:
All this made me look once again at the code for default certificate
locations in the openssl code and in the wget code. I think I need
to withdraw my suggestion for documentation of SSL_CERT_FILE and
SSL_CERT_DIR in the wget documentation, since a
Mathias Wittwer [EMAIL PROTECTED] writes:
ftp Want to authenticate on server. Is there any ftp
authentication built in?
Sure, use ftp://user:[EMAIL PROTECTED]/...
Do not see any options in wget --help
That is standard URL syntax, but it's also covered in the manual.
Mauro Tortonesi [EMAIL PROTECTED] writes:
i agree with you, hrvoje. we should fix the ssl options before the
1.10 release or we will have much bigger problems later.
OK. Thanks for your support. The SSL options have been submitted by
an external contributor, and I consider it my fault that I
Tanton Gibbs [EMAIL PROTECTED] writes:
I'm setting up a site for my company to allow people to get certain
files out of our company repository. Basically, I want people to be
able to write the following:
wget http://servername/~tgibbs/FileWanted.rpm
However, the files are stored
Thanks a lot for setting this up. I'll try to get Wget to log in.
BTW how are you running IIS on the Linux workstation? vmware?
Hrvoje Niksic [EMAIL PROTECTED] writes:
The following patch (now applied) fixes all the bugs you noticed
except #3. Which means that the infrastructure is all there, all the
right functions are called, we just need to figure out what we are
doing wrong.
Fixed now. It turns out the bug
[EMAIL PROTECTED] writes:
I think removing them is a bad idea. Even if very few people use
them, it's good to have them. Personally, I've used --sslprotocol a
couple of times. IMO, all these choices are what makes Linux
console utils so powerful.
You're misunderstanding me: I'm not proposing
wrong. I'll look into it later, I don't have time right now.
2005-04-22 Hrvoje Niksic [EMAIL PROTECTED]
* http.c (gethttp): Handle multiple WWW-Authentication headers,
only one of which is recognized. Those are sent by IIS with NTLM
authorization