RE: Wget

2006-07-13 Thread Post, Mark K
You would want to use the -O option, and write a script to create a unique file name to be passed to wget. Mark Post From: John McGill [mailto:[EMAIL PROTECTED] Sent: Thursday, July 13, 2006 4:56 AM To: wget@sunsite.dk Subject: Wget Hi, I hope you can help with a
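A minimal sketch of such a wrapper script (the URL and naming scheme are assumptions, not from the thread) — it derives a unique name from the timestamp and PID, then hands it to wget via -O:

```shell
#!/bin/sh
# Hypothetical wrapper: build a unique output name from the current
# timestamp and the shell's PID, then pass it to wget with -O.
# The URL is a placeholder.
url="http://example.com/report.html"
outfile="report-$(date +%Y%m%d-%H%M%S)-$$.html"
# Uncomment to actually download:
# wget -O "$outfile" "$url"
echo "$outfile"
```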

RE: wget 403 forbidden error when no index.html.

2006-07-07 Thread Post, Mark K
The short answer is that you don't get to do it. If your browser can't do it, wget isn't going to be able to do it. Mark Post -Original Message- From: news [mailto:[EMAIL PROTECTED] On Behalf Of Aditya Joshi Sent: Friday, July 07, 2006 12:15 PM To: wget@sunsite.dk Subject: wget 403

Excluding directories

2006-06-26 Thread Post, Mark K
I'm trying to download parts of the SUSE Linux 10.1 tree. I'm going after things below http://suse.mirrors.tds.net/pub/suse/update/10.1/, but I want to exclude several directories in http://suse.mirrors.tds.net/pub/suse/update/10.1/rpm/ In that directory are the following subdirectories: i586/
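One way to approach this (a sketch, not a confirmed answer from the thread): -X / --exclude-directories takes a comma-separated list of server-side paths to skip during recursion. Which subdirectories to exclude (i586 here) is an assumption.

```shell
#!/bin/sh
# Sketch: exclude a subtree from a recursive fetch with -X.
# The excluded path is an illustrative choice.
base="http://suse.mirrors.tds.net/pub/suse/update/10.1/"
exclude="/pub/suse/update/10.1/rpm/i586"
cmd="wget -r -np -X $exclude $base"
# Uncomment to run the real mirror:
# $cmd
echo "$cmd"
```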

RE: wget - tracking urls/web crawling

2006-06-22 Thread Post, Mark K
Try using the -np (no parent) parameter. Mark Post -Original Message- From: bruce [mailto:[EMAIL PROTECTED] Sent: Thursday, June 22, 2006 4:15 PM To: 'Frank McCown'; wget@sunsite.dk Subject: RE: wget - tracking urls/web crawling hi frank... there must be something simple i'm

RE: Limit time to run

2005-11-30 Thread Post, Mark K
I think that a combination of --limit-rate and --wait parameters makes this type of enhancement unnecessary, given that his stated purpose was to not hammer a particular site. Mark Post -Original Message- From: Mauro Tortonesi [mailto:[EMAIL PROTECTED] Sent: Wednesday, November 30,
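A sketch of the combination being suggested (the 20k rate cap, 2-second pause, and URL are illustrative assumptions):

```shell
#!/bin/sh
# Sketch: combine --limit-rate and --wait so a recursive fetch
# does not hammer the target site. Values are placeholders.
cmd="wget --limit-rate=20k --wait=2 -r -np http://example.com/docs/"
# Uncomment to run:
# $cmd
echo "$cmd"
```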

RE: retr.c:292: calc_rate: Assertion `bytes >= 0' failed.

2005-11-24 Thread Post, Mark K
Odd. It didn't take me long to find this: http://ftp.us.debian.org/debian/pool/main/w/wget/wget_1.10.2-1_i386.deb Mark Post -Original Message- From: Simeon Miteff [mailto:[EMAIL PROTECTED] Sent: Thursday, November 24, 2005 2:10 AM To: [EMAIL PROTECTED] Subject: retr.c:292: calc_rate:

RE: retr.c:292: calc_rate: Assertion `bytes >= 0' failed.

2005-11-24 Thread Post, Mark K
new package versions from the stable channel. Mark Post -Original Message- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED] Sent: Thursday, November 24, 2005 4:43 PM To: Post, Mark K Cc: Simeon Miteff; [EMAIL PROTECTED] Subject: Re: retr.c:292: calc_rate: Assertion `bytes >= 0' failed

RE: wget displays permission error

2005-09-01 Thread Post, Mark K
In the past, I have been confused as to whether the file which was generating the error was on the server, or on my local system. If there is a way to distinguish between the two, and be more explicit, that would be a little more helpful. I don't see any way wget could/should do anything except

RE: robots.txt takes precedence over -p

2005-08-08 Thread Post, Mark K
I hope that doesn't happen. While respecting robots.txt is not an absolute requirement, it is considered polite. I would not want the default behavior of wget to be considered impolite. Mark Post -Original Message- From: Mauro Tortonesi [mailto:[EMAIL PROTECTED] Sent: Monday, August

RE: robots.txt takes precedence over -p

2005-08-08 Thread Post, Mark K
unless I tell them otherwise. Mark Post -Original Message- From: Mauro Tortonesi [mailto:[EMAIL PROTECTED] Sent: Monday, August 08, 2005 8:35 PM To: Post, Mark K Cc: [EMAIL PROTECTED] Subject: Re: robots.txt takes precedence over -p On Monday 08 August 2005 07:30 pm, Post, Mark K wrote: I

RE: No more Libtool (long)

2005-06-27 Thread Post, Mark K
This is the kind of obnoxious commentary I've learned to expect from glibc's maintainers. It's no more becoming from you (or anyone else). Buzz off. Mark Post -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Maciej W. Rozycki Sent: Monday, June 27, 2005

RE: No more Libtool (long)

2005-06-27 Thread Post, Mark K
You already blew that opportunity when you told us to shut up. Blame yourself. Mark Post -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Maciej W. Rozycki Sent: Monday, June 27, 2005 11:15 AM To: Post, Mark K Cc: wget@sunsite.dk Subject: RE: No more

RE: No more Libtool (long)

2005-06-25 Thread Post, Mark K
I read the entire message, but I probably didn't have to. My experience with libtool in packages that really are building libraries has been pretty painful. Since wget doesn't build any, getting rid of it is one less thing to kill my builds in the future. Congratulations. Mark Post

RE: Switching to subversion for version control

2005-05-12 Thread Post, Mark K
You might want to give Ibiblio a try (www.ibiblio.org). They host my Slack/390 web/FTP site at no cost. They host a _bunch_ of sites at no cost. Mark Post -Original Message- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED] Sent: Thursday, May 12, 2005 5:24 AM To: wget@sunsite.dk

RE: Switching to subversion for version control

2005-05-12 Thread Post, Mark K
: Thursday, May 12, 2005 3:46 PM To: Post, Mark K Cc: wget@sunsite.dk Subject: Re: Switching to subversion for version control Post, Mark K [EMAIL PROTECTED] writes: You might want to give Ibiblio a try (www.ibiblio.org). They host my Slack/390 web/FTP site at no cost. They host a _bunch_ of sites

RE: links conversion; non-existent index.html

2005-05-01 Thread Post, Mark K
Probably because you're the only one that thinks it is a problem, instead of the way it needs to function? Nah, that couldn't be it. Mark Post -Original Message- From: Andrzej Kasperowicz [mailto:[EMAIL PROTECTED] Sent: Sunday, May 01, 2005 2:54 PM To: Jens Rösner; wget@sunsite.dk

RE: bug-wget still useful

2005-03-15 Thread Post, Mark K
I don't know why you say that. I see bug reports and discussion of fixes flowing through here on a fairly regular basis. Mark Post -Original Message- From: Dan Jacobson [mailto:[EMAIL PROTECTED] Sent: Tuesday, March 15, 2005 3:04 PM To: [EMAIL PROTECTED] Subject: bug-wget still

RE: 403 Forbidden Errors with mac.com

2005-02-08 Thread Post, Mark K
Don't know what is happening on your end. I just executed wget http://idisk.mac.com/tombb/Public/tex-edit-plus-X.sit and it downloaded 2,484,062 bytes of something. What does using the -d option show you? Mark Post -Original

RE: selective recursive downloading

2005-01-21 Thread Post, Mark K
wget -m -np http://url.to.download/something/group-a/want-to-download/ \ http://url.to.download/something/group-b/want-to-download/ \ http://url.to.download/something/group-c/want-to-download/ Mark Post -Original Message- From: Gabor Istvan [mailto:[EMAIL PROTECTED] Sent: Friday,

RE: Metric units

2004-12-23 Thread Post, Mark K
Yeah, you're both right. While we're at it, why don't we just round off the value of pi to be 3.0. Those pesky trailing decimals are just an accident of history anyway. -Original Message- From: Carlos Villegas [mailto:[EMAIL PROTECTED] Sent: Thursday, December 23, 2004 8:22 PM To:

RE: Metric units

2004-12-23 Thread Post, Mark K
No, but that particular bit of idiocy was the inspiration for my comment. I just took it one decimal point further. -Original Message- From: Tony Lewis [mailto:[EMAIL PROTECTED] Sent: Friday, December 24, 2004 2:22 AM To: wget@sunsite.dk Subject: RE: Metric units Mark Post wrote:

RE: Escaping semicolons (actually Ampersands)

2004-06-29 Thread Post, Mark K
Then you haven't looked at enough web sites. Whenever tidydbg (from w3.org) tells me to do that in one of my URLs, I do that. I've got one page of links that has tons of them. They work. Can we stop arguing about this off-topic bit now? Mark Post -Original Message- From: Tony Lewis

RE: wget hangs or downloads end up incomplete in Windows 2000 XP.

2004-05-20 Thread Post, Mark K
Are you behind a firewall or proxy of some kind? If so, you might want to try using passive FTP mode. Mark Post -Original Message- From: Phillip Pi [mailto:[EMAIL PROTECTED] Sent: Thursday, May 20, 2004 3:08 PM To: [EMAIL PROTECTED] Subject: RE: wget hangs or downloads end up

RE: GNU Wget 1.9.1

2004-05-12 Thread Post, Mark K
It's a known bug. I'm waiting for a fix for it myself. Mark Post -Original Message- From: Lawrance, Mark [mailto:[EMAIL PROTECTED] Sent: Wednesday, May 12, 2004 9:09 AM To: [EMAIL PROTECTED] Subject: GNU Wget 1.9.1 GNU Wget 1.9.1 The non-interactive download

RE: Preserving file ownership

2004-05-01 Thread Post, Mark K
I don't believe so. You might want to take a look at rsync instead. It does a very nice job of doing just what you need. Mark Post -Original Message- From: Kathryn Moretz [mailto:[EMAIL PROTECTED] Sent: Thursday, April 29, 2004 4:40 PM To: [EMAIL PROTECTED] Cc: Kathryn Moretz

RE: problem with # in path

2004-01-22 Thread Post, Mark K
It's more likely your system/shell that is doing it, if you're using Linux or UNIX. wget -r -l 0 ftp://19.24.24.24/some/datase/C\#Tool/ Mark Post -Original Message- From: Peter Mikeska [mailto:[EMAIL PROTECTED] Sent: Thursday, January 22, 2004 6:28 PM To: [EMAIL PROTECTED] Subject:

RE: Syntax question ...

2004-01-21 Thread Post, Mark K
Well, that's what you're telling it to do with the -S option, so why are you surprised? man wget, then /-S Mark Post -Original Message- From: Simons, Rick [mailto:[EMAIL PROTECTED] Sent: Wednesday, January 21, 2004 11:09 AM To: '[EMAIL PROTECTED]' Subject: RE: Syntax question ... I

RE: wget -- ftp with proxy

2004-01-13 Thread Post, Mark K
Yes, it should be. Mark Post -Original Message- From: Cui, Byron [mailto:[EMAIL PROTECTED] Sent: Tuesday, January 13, 2004 11:57 AM To: [EMAIL PROTECTED] Subject: wget -- ftp with proxy Hi, If use ftp through proxy, would the passive-ftp option still be valid? Thanks. Byron Cui

RE: wget can't get the following site

2004-01-09 Thread Post, Mark K
Because the URL has special characters in it, surround it in double quotes: wget "http://quicktake.morningstar.com/Stock/Income10.asp?Country=USA&Symbol=JNJ&stocktab=finance" Mark Post -Original Message- From: David C. [mailto:[EMAIL PROTECTED] Sent: Friday, January 09, 2004 2:01 AM To:

RE: SSL over proxy passthrough

2003-11-28 Thread Post, Mark K
I tested the Windows binary against the only SSL-enabled web server outside our firewall that I could think of at the moment, and it worked for me. Mark Post -Original Message- From: Herold Heiko [mailto:[EMAIL PROTECTED] Sent: Friday, November 28, 2003 3:18 AM To: [EMAIL PROTECTED] Cc:

RE: problem with LF/CR etc.

2003-11-19 Thread Post, Mark K
That is _really_ ugly, and perhaps immoral. Make it an option, if you must. Certainly don't make it the default behavior. Shudder Mark Post -Original Message- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED] Sent: Wednesday, November 19, 2003 4:59 PM To: Peter GILMAN Cc: [EMAIL

RE: how to get mirror just a portion of a website ?

2003-11-16 Thread Post, Mark K
Use the -np or --no-parent option. Mark Post -Original Message- From: Josh Brooks [mailto:[EMAIL PROTECTED] Sent: Sunday, November 16, 2003 11:48 PM To: [EMAIL PROTECTED] Subject: how to get mirror just a portion of a website ? Generally, I mirror an entire web site with: wget
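A sketch of the suggested invocation (the URL is a placeholder): -np (--no-parent) keeps a recursive fetch from climbing above the starting directory, so only that branch of the site is mirrored.

```shell
#!/bin/sh
# Sketch: mirror only one branch of a site with -m plus -np.
# The URL is a placeholder.
cmd="wget -m -np http://example.com/manual/"
# $cmd   # uncomment to run
echo "$cmd"
```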

RE: feature request: --second-guess-the-dns

2003-11-15 Thread Post, Mark K
You can do this now: wget http://216.46.192.85/ Using DNS is just a convenience after all, not a requirement. Mark Post -Original Message- From: Dan Jacobson [mailto:[EMAIL PROTECTED] Sent: Saturday, November 15, 2003 4:00 PM To: [EMAIL PROTECTED] Subject: feature request:

RE: Compile and link problems with wget 1.9 beta5

2003-10-12 Thread Post, Mark K
Do you see the missing symbol when you do an nm -D command against either libssl.so or libcrypto.so? (It shows up on my Linux system in libcrypto.so.) Mark Post -Original Message- From: Robert Poole [mailto:[EMAIL PROTECTED] Sent: Sunday, October 12, 2003 2:23 PM To: [EMAIL PROTECTED]
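A sketch of the suggested check (library paths and the symbol name are assumptions; adjust for your system and for the symbol the linker reported missing):

```shell
#!/bin/sh
# Sketch: look for a symbol in libcrypto's dynamic symbol table with nm -D.
# RSA_new and the candidate paths are illustrative assumptions.
result=skipped
for lib in /usr/lib/libcrypto.so /usr/lib/x86_64-linux-gnu/libcrypto.so.3; do
  if [ -e "$lib" ]; then
    if nm -D "$lib" 2>/dev/null | grep -qw RSA_new; then
      result=found    # symbol is exported by this library
    else
      result=absent   # library present but symbol not listed
    fi
    break
  fi
done
echo "$result"
```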

RE: Small change to print SSL version

2003-09-17 Thread Post, Mark K
Perhaps, but it is kind of nice to get that information from the program itself at the same time you get the version information. For example: # ssh -V OpenSSH_3.7p1, SSH protocols 1.5/2.0, OpenSSL 0.9.7b 10 Apr 2003 All the information, from one place. Mark Post -Original Message-

RE: wget -r -p -k -l 5 www.protcast.com doesnt pull some images though they are part of the HREF

2003-09-09 Thread Post, Mark K
No, it won't. The javascript stuff makes sure of that. Mark Post -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] Sent: Tuesday, September 09, 2003 4:32 PM To: [EMAIL PROTECTED] Subject: wget -r -p -k -l 5 www.protcast.com doesnt pull some images though they are

RE: rfc2732 patch for wget

2003-09-08 Thread Post, Mark K
Absolutely. I would much rather get an intelligent error message stating that ipv6 addresses are not supported, versus a misleading one about the host not being found. That would save end-users a whole lot of wasted time. Mark Post -Original Message- From: Hrvoje Niksic [mailto:[EMAIL

RE: wget and 2 users / passwords to get through?

2003-08-20 Thread Post, Mark K
If this is a non-transparent proxy, you do indeed need to use the proxy parameters: --proxy-user=user --proxy-passwd=password As well as set the proxy server environment variables ftp_proxy=http://proxy.server.name[:port] - Note the http:// value. That is correct.
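A sketch of the setup described above: the ftp_proxy variable points at the HTTP proxy (note the http:// scheme), and the proxy credentials go on the command line. Proxy host, port, user, and the FTP URL are all placeholders.

```shell
#!/bin/sh
# Sketch: FTP through a non-transparent HTTP proxy.
# ftp_proxy deliberately uses an http:// URL, per the advice above.
export ftp_proxy="http://proxy.example.com:8080"
cmd="wget --proxy-user=user --proxy-passwd=password ftp://ftp.example.com/pub/file.tar.gz"
# $cmd   # uncomment to fetch through the proxy
echo "$ftp_proxy"
```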

RE: wget is mirroring whole internet instead of just my web page!

2003-08-18 Thread Post, Mark K
man wget shows: -D domain-list --domains=domain-list Set domains to be followed. domain-list is a comma-separated list of domains. Note that it does not turn on -H. Mark Post -Original Message- From: Andrzej Kasperowicz [mailto:[EMAIL PROTECTED]
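A sketch of the option in use (domains and URL are placeholders): -D restricts which hosts may be followed, and, as the man page excerpt notes, it does not turn on -H, so -H must be given separately if spanning hosts at all.

```shell
#!/bin/sh
# Sketch: restrict a mirror to the named domains with -D.
# -H enables host spanning; -D then limits it to this list.
cmd="wget -m -H -Dexample.com,example.org http://www.example.com/"
# $cmd   # uncomment to run
echo "$cmd"
```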

RE: wget is mirroring whole internet instead of just my web page!

2003-08-18 Thread Post, Mark K
It's always been my experience when specifying -m that wget does follow across domains by default. I've always had to tell it not to do that. Mark Post -Original Message- From: Andrzej Kasperowicz [mailto:[EMAIL PROTECTED] Sent: Monday, August 18, 2003 4:02 PM To: Post, Mark K; [EMAIL

RE: Wget 1.8.2 timestamping bug

2003-08-06 Thread Post, Mark K
Angelo, It works for me: # wget -N http://www.nic.it/index.html --13:04:39-- http://www.nic.it/index.html => `index.html' Resolving www.nic.it... done. Connecting to www.nic.it[193.205.245.10]:80... connected. HTTP request sent, awaiting response... 200 OK Length: 2,474 [text/html]

RE: wget and procmail

2003-07-29 Thread Post, Mark K
Does the PATH of procmail contain the directory where wget lives? Mark Post -Original Message- From: Michel Lombart [mailto:[EMAIL PROTECTED] Sent: Tuesday, July 29, 2003 6:51 PM To: [EMAIL PROTECTED] Subject: wget and procmail Hello, I've an issue with wget and procmail. I install

RE: -N option

2003-07-29 Thread Post, Mark K
Other than the --ignore-length option I mentioned previously, no. Sorry. Mark Post -Original Message- From: Preston [mailto:[EMAIL PROTECTED] Sent: Tuesday, July 29, 2003 7:01 PM To: [EMAIL PROTECTED] Subject: Re: -N option Aaron S. Hawley wrote: On Tue, 29 Jul 2003, Post, Mark K

RE: wget: ftp through http proxy not working with 1.8.2. It does work with 1.5.3

2003-07-14 Thread Post, Mark K
Hans, I'm investigating this as a proxy server problem. When I ran some tests, it appeared as though the HEAD command from wget was getting translated into a series of commands to query the size, MDTM of the file, etc., but then I was seeing a STOR command come from the proxy server, which was

RE: wget: ftp through http proxy not working with 1.8.2. It does work with 1.5.3

2003-07-14 Thread Post, Mark K
still interested in working on wget). Mark -Original Message- From: Hans Deragon (QA/LMC) [mailto:[EMAIL PROTECTED] Sent: Monday, July 14, 2003 1:05 PM To: 'Post, Mark K' Subject: RE: wget: ftp through http proxy not working with 1.8.2. It does work with 1.5.3 wget --debug -m ftp

RE: Segfault on Linux/390 for wget 1.6 and 1.7

2001-10-03 Thread Post, Mark K
Jan, Did you ever make any progress on this? Mark Post -Original Message- From: Jan Prikryl [mailto:[EMAIL PROTECTED]] Sent: Thursday, July 19, 2001 1:53 PM To: Post, Mark K Cc: Wget mailing list Subject: Re: Segfault on Linux/390 for wget 1.6 and 1.7 Quoting Post, Mark K ([EMAIL

Addition to MACHINES File

2001-07-19 Thread Post, Mark K
As requested, I am including the output from ./config.guess for my Linux for S/390 system. # ./config.guess s390-ibm-linux Version 1.5.3 works just fine on this system, although I am having problems with 1.6 and 1.7, which I am detailing in a separate email. Mark Post

Segfault on Linux/390 for wget 1.6 and 1.7

2001-07-19 Thread Post, Mark K
I am having problems with both wget 1.6 and wget 1.7. I have a working wget 1.5.3 that I use quite a lot. When I compile wget 1.6 or 1.7, using either the -O2 (default) or -O1 parameters on gcc 2.95.2, I get segmentation faults as follows: # wget -m -nd