Probably because you're the only one that thinks it is a problem, instead of
the way it needs to function? Nah, that couldn't be it.
Mark Post
-Original Message-
From: Andrzej Kasperowicz [mailto:[EMAIL PROTECTED]
Sent: Sunday, May 01, 2005 2:54 PM
To: Jens Rösner; wget@sunsite.dk
Sub
You might want to give Ibiblio a try (www.ibiblio.org). They host my
Slack/390 web/FTP site at no cost. They host a _bunch_ of sites at no
cost.
Mark Post
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Sent: Thursday, May 12, 2005 5:24 AM
To: wget@sunsite.dk
Subject:
TED]
Sent: Thursday, May 12, 2005 3:46 PM
To: Post, Mark K
Cc: wget@sunsite.dk
Subject: Re: Switching to subversion for version control
"Post, Mark K" <[EMAIL PROTECTED]> writes:
> You might want to give Ibiblio a try (www.ibiblio.org). They host my
> Slack/390 web/FTP site at
I read the entire message, but I probably didn't have to. My experience
with libtool in packages that really are building libraries has been
pretty painful. Since wget doesn't build any, getting rid of it is one
less thing to kill my builds in the future. Congratulations.
Mark Post
-Origi
This is the kind of obnoxious commentary I've learned to expect from
glibc's maintainers. It's no more becoming from you (or anyone else).
Buzz off.
Mark Post
-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Maciej W. Rozycki
Sent: Monday, June 27, 2005
You already blew that opportunity when you told us to shut up. Blame
yourself.
Mark Post
-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Maciej W. Rozycki
Sent: Monday, June 27, 2005 11:15 AM
To: Post, Mark K
Cc: wget@sunsite.dk
Subject: RE: No more
I hope that doesn't happen. While respecting robots.txt is not an
absolute requirement, it is considered polite. I would not want the
default behavior of wget to be considered impolite.
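For anyone who genuinely needs to override it, the opt-out already exists and is
explicit; something like this (URL is just a placeholder) keeps ignoring
robots.txt a deliberate choice rather than the default:
wget -e robots=off -r http://example.com/some/dir/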
Mark Post
-Original Message-
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
Sent: Monday, August 0
olite unless I tell them otherwise.
Mark Post
-Original Message-
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
Sent: Monday, August 08, 2005 8:35 PM
To: Post, Mark K
Cc: [EMAIL PROTECTED]
Subject: Re: robots.txt takes precedence over -p
On Monday 08 August 2005 07:30 pm, Post, Mark K
In the past, I have been confused as to whether the file which was
generating the error was on the server, or on my local system. If there
is a way to distinguish between the two, and be more explicit, that
would be a little more helpful.
I don't see any way wget could/should do anything except r
Odd. It didn't take me long to find this:
http://ftp.us.debian.org/debian/pool/main/w/wget/wget_1.10.2-1_i386.deb
Mark Post
-Original Message-
From: Simeon Miteff [mailto:[EMAIL PROTECTED]
Sent: Thursday, November 24, 2005 2:10 AM
To: [EMAIL PROTECTED]
Subject: retr.c:292: calc_rate: A
ck up new package versions from the stable channel.
Mark Post
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Sent: Thursday, November 24, 2005 4:43 PM
To: Post, Mark K
Cc: Simeon Miteff; [EMAIL PROTECTED]
Subject: Re: retr.c:292: calc_rate: Assertion `bytes >= 0
I think that a combination of --limit-rate and --wait parameters makes
this type of enhancement unnecessary, given that his stated purpose was
to not "hammer" a particular site.
Mark Post
-Original Message-
From: Mauro Tortonesi [mailto:[EMAIL PROTECTED]
Sent: Wednesday, November 30, 20
Try using the -np (no parent) parameter.
Mark Post
-Original Message-
From: bruce [mailto:[EMAIL PROTECTED]
Sent: Thursday, June 22, 2006 4:15 PM
To: 'Frank McCown'; wget@sunsite.dk
Subject: RE: wget - tracking urls/web crawling
hi frank...
there must be something simple i'm missing.
I'm trying to download parts of the SUSE Linux 10.1 tree. I'm going
after things below http://suse.mirrors.tds.net/pub/suse/update/10.1/,
but I want to exclude several directories in
http://suse.mirrors.tds.net/pub/suse/update/10.1/rpm/
In that directory are the following subdirectories:
i586/
i6
The short answer is that you don't get to do it. If your browser can't
do it, wget isn't going to be able to do it.
Mark Post
-Original Message-
From: news [mailto:[EMAIL PROTECTED] On Behalf Of Aditya Joshi
Sent: Friday, July 07, 2006 12:15 PM
To: wget@sunsite.dk
Subject: wget 403 for
You would want to use the -O option, and write a script to create a
unique file name to be passed to wget.
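A rough sketch of what I mean (naming scheme and URL are just placeholders):
outfile="page-$(date +%Y%m%d-%H%M%S).html"
wget -O "$outfile" http://example.com/page.html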
Mark Post
From: John McGill [mailto:[EMAIL PROTECTED]
Sent: Thursday, July 13, 2006 4:56 AM
To: wget@sunsite.dk
Subject: Wget
Hi,
I hope you can help wit
As requested, I am including the output from ./config.guess for my
Linux for S/390 system.
# ./config.guess
s390-ibm-linux
Version 1.5.3 works just fine on this system, although I am having
problems with 1.6 and 1.7, which I am detailing in a separate email.
Mark Post
I am having problems with both wget 1.6 and wget 1.7. I have a working wget
1.5.3 that I use quite a lot.
When I compile wget 1.6 or 1.7, using either the -O2 (default) or -O1
parameters on gcc 2.95.2, I get segmentation faults as follows:
# wget -m -nd
ftp://ftp.slackware.com/pub/slackware/
Jan,
Did you ever make any progress on this?
Mark Post
-Original Message-
From: Jan Prikryl [mailto:[EMAIL PROTECTED]]
Sent: Thursday, July 19, 2001 1:53 PM
To: Post, Mark K
Cc: Wget mailing list
Subject: Re: Segfault on Linux/390 for wget 1.6 and 1.7
Quoting Post, Mark K ([EMAIL
Our firewall group just changed the proxy they use for ftp. As a result, a
problem with wget 1.8.2 surfaced. When I issue this command:
wget -N ftp://ftp.slackware.com/pub/slackware/slackware-9.0/ChangeLog.txt
the results I get depend on whether ChangeLog.txt exists in the current
working directo
I see the exact same problem:
$ wget --debug -m
ftp://ftp-linux.cc.gatech.edu/slackware/slackware-9.0/patches/
DEBUG output created by Wget 1.8.2 on linux-gnu.
--14:25:49-- ftp://ftp-linux.cc.gatech.edu/slackware/slackware-9.0/patches/
=>
`ftp-linux.cc.gatech.edu/slackware/slackware-9
Hans,
I'm investigating this as a proxy server problem. When I ran some tests, it
appeared as though the HEAD command from wget was getting "translated" into
a series of commands to query the size, MDTM of the file, etc., but then I
was seeing a STOR command come from the proxy server, which was
he's still
interested in working on wget).
Mark
-Original Message-
From: Hans Deragon (QA/LMC) [mailto:[EMAIL PROTECTED]
Sent: Monday, July 14, 2003 1:05 PM
To: 'Post, Mark K'
Subject: RE: wget: ftp through http proxy not working with 1.8.2. It does work with 1.5.3
This works:
telnet 132.163.4.101 14 | tail -n 1 > outfile
But so does
wget http://tycho.usno.navy.mil/cgi-bin/timer.pl
if we want to bring this back to being on topic. :)
Mark Post
-Original Message-
From: Nicolas Schodet [mailto:[EMAIL PROTECTED]
Sent: Tuesday, July 22, 2003
I just issued this command on my Slackware 9.0 system:
~$ wget
http://www.lsc.hu/debian/pool/main/gcc-3.2-base_1%253a3.2.3-0pre9_i386.deb
--18:18:55--
http://www.lsc.hu/debian/pool/main/gcc-3.2-base_1%253a3.2.3-0pre9_i386.deb
=> `gcc-3.2-base_1%253a3.2.3-0pre9_i386.deb'
Resolving interne
Morris,
It looks like you're going through a proxying firewall, correct? If so, a
number of us have seen the same problem. There's been no response from the
maintainer, so far, so no resolution is in sight.
Mark Post
-Original Message-
From: Morris Hooten - SES System Admin [mailto:[E
Morris,
Correct. I and the others see the "index.html" file downloaded, and then
things stop.
Mark Post
-Original Message-
From: Morris Hooten - SES System Admin [mailto:[EMAIL PROTECTED]
Sent: Friday, July 25, 2003 3:36 PM
To: Morris Hooten - SES System Admin
Cc: Post, Mark
Preston,
The "--ignore-length" option _may_ do what you want. As Aaron pointed out,
if they update the file on the remote server after the photo has been
updated locally, it will get wiped out based on the date, not the length.
So, perhaps you need to modify your work practices rather than diddle
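If you do want to try it, the command would look something like this (URL is a
placeholder):
wget -N --ignore-length http://example.com/photos/picture.jpg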
Does the PATH of procmail contain the directory where wget lives?
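If it doesn't, either set PATH near the top of the .procmailrc or call wget by
its full path. A rough procmail sketch (paths, condition, and URL are all just
guesses):
PATH=/usr/local/bin:/usr/bin:/bin
:0c
* ^Subject:.*fetch
| wget -q -O /dev/null http://example.com/trigger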
Mark Post
-Original Message-
From: Michel Lombart [mailto:[EMAIL PROTECTED]
Sent: Tuesday, July 29, 2003 6:51 PM
To: [EMAIL PROTECTED]
Subject: wget and procmail
Hello,
I've an issue with wget and procmail.
I install t
Other than the "--ignore-length" option I mentioned previously, no. Sorry.
Mark Post
-Original Message-
From: Preston [mailto:[EMAIL PROTECTED]
Sent: Tuesday, July 29, 2003 7:01 PM
To: [EMAIL PROTECTED]
Subject: Re: -N option
Aaron S. Hawley wrote:
>On Tue, 29 Jul 2003,
Sanjeev,
You're going to need to do a couple of things.
1. Create a wgetrc file. In it, put a value for http_proxy= that will
specify your
http proxy server name, and port number. You can put this in the form of:
http_proxy=http://sanjeevs:[EMAIL PROTECTED]:portnum/
Or, you can specify the proxy
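Either way, a minimal wgetrc sketch would be something like this (proxy host
and port are placeholders; credentials can go in the URL form shown above):
use_proxy = on
http_proxy = http://proxyhost:8080/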
Angelo,
It works for me:
# wget -N http://www.nic.it/index.html
--13:04:39-- http://www.nic.it/index.html
=> `index.html'
Resolving www.nic.it... done.
Connecting to www.nic.it[193.205.245.10]:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 2,474 [text/html]
10
Steve,
Two ways. Either edit the /etc/wgetrc file and add the information there.
(There are commented-out samples in the file.) Alternatively, copy /etc/wgetrc
to ~/.wgetrc and do it there.
Or, have an environment variable defined for the particular protocol:
http_proxy=
https_proxy=
ftp_proxy=
No
man wget shows:
-D domain-list
--domains=domain-list
Set domains to be followed. domain-list is a comma-separated
list of domains.
Note that it does not turn on -H.
Mark Post
-Original Message-
From: Andrzej Kasperowicz [mailto:[EMAIL PROTECTED]
Sent:
It's always been my experience when specifying -m that wget does follow
across domains by default. I've always had to tell it not to do that.
Mark Post
-Original Message-
From: Andrzej Kasperowicz [mailto:[EMAIL PROTECTED]
Sent: Monday, August 18, 2003 4:02 PM
To: Post, Mark
If this is a non-transparent proxy, you do indeed need to use the proxy
parameters:
--proxy-user=user
--proxy-passwd=password
As well as set the proxy server environment variables
ftp_proxy=http://proxy.server.name[:port] <- Note the http:// value. That
is correct.
http_proxy=http://proxy.server
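Putting it together, an invocation might look something like this (proxy host,
account, and FTP URL are all placeholders):
export ftp_proxy=http://proxy.example.com:8080/
wget --proxy-user=youruser --proxy-passwd=yourpassword ftp://ftp.example.com/pub/somefile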
Absolutely. I would much rather get an intelligent error message stating
that ipv6 addresses are not supported, versus a misleading one about the
host not being found. That would save end-users a whole lot of wasted time.
Mark Post
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL
No, it won't. The javascript stuff makes sure of that.
Mark Post
-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
Sent: Tuesday, September 09, 2003 4:32 PM
To: [EMAIL PROTECTED]
Subject: wget -r -p -k -l 5 www.protcast.com doesnt pull some images
though they are part
Perhaps, but it is kind of nice to get that information from the program
itself at the same time you get the version information. For example:
# ssh -V
OpenSSH_3.7p1, SSH protocols 1.5/2.0, OpenSSL 0.9.7b 10 Apr 2003
All the information, from one place.
Mark Post
-Original Message-
Fro
Do you see the missing symbol when you do an "nm -D" command against either
libssl.so or libcrypto.so? (It shows up on my Linux system in
libcrypto.so.)
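Something along these lines (library paths are typical guesses; use the symbol
name from your link error) should settle it:
nm -D /usr/lib/libssl.so | grep <missing_symbol>
nm -D /usr/lib/libcrypto.so | grep <missing_symbol>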
Mark Post
-Original Message-
From: Robert Poole [mailto:[EMAIL PROTECTED]
Sent: Sunday, October 12, 2003 2:23 PM
To: [EMAIL PROTECTED]
I'm a little confused. OpenSSL is licensed pretty much the same as Apache.
What's the GPL issue with that style of license?
Mark Post
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Sent: Wednesday, November 05, 2003 8:27 AM
To: [EMAIL PROTECTED]
Subject: GNU TLS vs. Op
You can do this now:
wget http://216.46.192.85/
Using DNS is just a convenience after all, not a requirement.
Mark Post
-Original Message-
From: Dan Jacobson [mailto:[EMAIL PROTECTED]
Sent: Saturday, November 15, 2003 4:00 PM
To: [EMAIL PROTECTED]
Subject: feature request: --second-gue
Use the -np or --no-parent option.
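For example, something like this (URL is a placeholder) stays below the
starting directory instead of crawling the whole site:
wget --mirror --no-parent http://example.com/section/subdir/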
Mark Post
-Original Message-
From: Josh Brooks [mailto:[EMAIL PROTECTED]
Sent: Sunday, November 16, 2003 11:48 PM
To: [EMAIL PROTECTED]
Subject: how to get mirror just a portion of a website ?
Generally, I mirror an entire web site with:
wget --trie
That is _really_ ugly, and perhaps immoral. Make it an option, if you must.
Certainly don't make it the default behavior.
Shudder
Mark Post
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Sent: Wednesday, November 19, 2003 4:59 PM
To: Peter GILMAN
Cc: [EMAIL PROTEC
o one long
string and see if wget still works well.) I just find the whole idea
abhorrent to start with.
Mark Post
-Original Message-
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]
Sent: Wednesday, November 19, 2003 6:04 PM
To: Post, Mark K
Cc: Peter GILMAN; [EMAIL PROTECTED]
Subjec
I tested the Windows binary against the only SSL-enabled web server outside
our firewall that I could think of at the moment, and it worked for me.
Mark Post
-Original Message-
From: Herold Heiko [mailto:[EMAIL PROTECTED]
Sent: Friday, November 28, 2003 3:18 AM
To: [EMAIL PROTECTED]
Cc: L
Because the URL has special characters in it, surround it in double quotes:
wget
"http://quicktake.morningstar.com/Stock/Income10.asp?Country=USA&Symbol=JNJ&;
stocktab=finance"
Mark Post
-Original Message-
From: David C. [mailto:[EMAIL PROTECTED]
Sent: Friday, January 09, 2004 2:01 AM
To
Yes, it should be.
Mark Post
-Original Message-
From: Cui, Byron [mailto:[EMAIL PROTECTED]
Sent: Tuesday, January 13, 2004 11:57 AM
To: [EMAIL PROTECTED]
Subject: wget -- ftp with proxy
Hi,
If use ftp through proxy, would the passive-ftp option still be valid?
Thanks.
Byron Cui
e-
Well, that's what you're telling it to do with the -S option, so why are you
surprised? "man wget", then "/-S"
Mark Post
-Original Message-
From: Simons, Rick [mailto:[EMAIL PROTECTED]
Sent: Wednesday, January 21, 2004 11:09 AM
To: '[EMAIL PROTECTED]'
Subject: RE: Syntax question ...
I
It's more likely your system/shell that is doing it, if you're using Linux
or UNIX.
wget -r -l 0 ftp://19.24.24.24/some/datase/C\#Tool/
Mark Post
-Original Message-
From: Peter Mikeska [mailto:[EMAIL PROTECTED]
Sent: Thursday, January 22, 2004 6:28 PM
To: [EMAIL PROTECTED]
Subject: prob
It's a known bug. I'm waiting for a fix for it myself.
Mark Post
-Original Message-
From: Lawrance, Mark [mailto:[EMAIL PROTECTED]
Sent: Wednesday, May 12, 2004 9:09 AM
To: [EMAIL PROTECTED]
Subject: GNU Wget 1.9.1
GNU Wget 1.9.1
The non-interactive download uti
I don't believe so. You might want to take a look at rsync instead. It
does a very nice job of doing just what you need.
Mark Post
-Original Message-
From: Kathryn Moretz [mailto:[EMAIL PROTECTED]
Sent: Thursday, April 29, 2004 4:40 PM
To: [EMAIL PROTECTED]
Cc: Kathryn Moretz
Subject:
Are you behind a firewall or proxy of some kind? If so, you might want to
try using passive FTP mode.
Mark Post
-Original Message-
From: Phillip Pi [mailto:[EMAIL PROTECTED]
Sent: Thursday, May 20, 2004 3:08 PM
To: [EMAIL PROTECTED]
Subject: RE: wget hangs or downloads end up incomplet
Then you haven't looked at enough web sites. Whenever tidydbg (from w3.org)
tells me to do that in one of my URLs, I do that. I've got one page of
links that has tons of them. They work. Can we stop arguing about this
off-topic bit now?
Mark Post
-Original Message-
From: Tony Lewis [
I can't answer your question as such, but I know the "ssh-keygen" command will
allow you to create a key pair with a null passphrase. There is also a means
with the openssl commands to do the same thing, but I lost my bookmark to the
webpage where I found out how to do that.
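For the ssh-keygen part, a minimal example (key type and output file name are
just examples) would be:
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa_batch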
Yeah, you're both right. While we're at it, why don't we just round off the
value of pi to be 3.0. Those pesky trailing decimals are just an accident
of history anyway.
-Original Message-
From: Carlos Villegas [mailto:[EMAIL PROTECTED]
Sent: Thursday, December 23, 2004 8:22 PM
To: Tony
No, but that particular bit of idiocy was the inspiration for my comment. I
just took it one decimal point further.
-Original Message-
From: Tony Lewis [mailto:[EMAIL PROTECTED]
Sent: Friday, December 24, 2004 2:22 AM
To: wget@sunsite.dk
Subject: RE: Metric units
Mark Post wrote:
>
Put the URL in double quotes. That worked for me.
Mark Post
-Original Message-
From: szulevzs [mailto:[EMAIL PROTECTED]
Sent: Sunday, December 26, 2004 5:23 AM
To: [EMAIL PROTECTED]
Subject: bug
WGET can not download the following link:
Wget --tries=5 http://ex
wget -m -np http://url.to.download/something/group-a/want-to-download/ \
http://url.to.download/something/group-b/want-to-download/ \
http://url.to.download/something/group-c/want-to-download/
Mark Post
-Original Message-
From: Gabor Istvan [mailto:[EMAIL PROTECTED]
Sent: Friday, Januar
Don't know what is happening on your end. I just executed
wget http://idisk.mac.com/tombb/Public/tex-edit-plus-X.sit
and it downloaded 2,484,062 bytes of something.
What does using the -d option show you?
Mark Post
-Original Message-
it without that, since that works for me.
Mark Post
-Original Message-
From: Emily Jackson [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, February 08, 2005 5:55 PM
To: Post, Mark K
Cc: wget@sunsite.dk
Subject: Re: "403 Forbidden" Errors with mac.com
On Tue, 8 Feb 2005 12:0
I don't know why you say that. I see bug reports and discussion of fixes
flowing through here on a fairly regular basis.
Mark Post
-Original Message-
From: Dan Jacobson [mailto:[EMAIL PROTECTED]
Sent: Tuesday, March 15, 2005 3:04 PM
To: [EMAIL PROTECTED]
Subject: bug-wget still useful