Re: wget-1.8.2

2003-10-10 Thread Hrvoje Niksic
It seems that you're behind a firewall and need to use passive ftp. Try the `--passive-ftp' flag, or specify `passive_ftp = on' in ~/.wgetrc.
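The two ways to enable passive mode mentioned above can be sketched as follows (the `--passive-ftp' flag and the `passive_ftp' wgetrc key come from the message; the host and path are placeholders, so this is an illustrative fragment, not a command to run as-is):

```shell
# One-off: pass the flag on the command line (host/path are placeholders).
wget --passive-ftp ftp://ftp.example.org/pub/file.tar.gz

# Persistent: add this line to ~/.wgetrc instead:
#   passive_ftp = on
```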

Re: wget checks timestamp on wrong file

2003-10-09 Thread Hrvoje Niksic
It's a bug. -O currently doesn't work everywhere it should. If you just want to change the directory where Wget operates, the workaround is to use `-P'. E.g.: wget -N ftp://ftp.pld-linux.org/dists/ac/PLD/athlon/PLD/RPMS/packages.dir.mdd -P
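A sketch of the `-P' workaround described above; the command in the message is truncated, so the URL and directory below are placeholders (illustrative fragment only):

```shell
# -N enables timestamp comparison; -P sets the directory prefix wget
# downloads into, avoiding the broken -O + -N combination.
wget -N -P /var/mirror/rpms ftp://ftp.example.org/pub/packages.dir.mdd
```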

Re: wget ipv6 patch

2003-10-08 Thread Hrvoje Niksic
Mauro Tortonesi [EMAIL PROTECTED] writes: so, i am asking you: what do you think of these changes? Overall they look very good! Judging from the patch, a large part of the work seems to be in an unexpected place: the FTP code. Here are some remarks I got looking at the patch. It

Re: wget ipv6 patch

2003-10-08 Thread Mauro Tortonesi
On Wed, 8 Oct 2003, Hrvoje Niksic wrote: Mauro Tortonesi [EMAIL PROTECTED] writes: so, i am asking you: what do you think of these changes? Overall they look very good! Judging from the patch, a large part of the work seems to be in an unexpected place: the FTP code. yes, i have

Re: wget ipv6 patch

2003-10-08 Thread Hrvoje Niksic
Mauro Tortonesi [EMAIL PROTECTED] writes: I still don't understand the choice to use sockaddr and sockaddr_storage in application code. They result in needless casts and (to me) incomprehensible code. well, using sockaddr_storage is the right way (TM) to write IPv6-enabled code ;-) Not

Re: wget ipv6 patch

2003-10-08 Thread Dražen Kačar
Mauro Tortonesi wrote: are there __REALLY__ systems which do not support inet_aton? their ISVs should be ashamed of themselves... Solaris, for example. IIRC inet_aton isn't in any document which claims to be a standard. however, yours seemed to me an ugly hack, so i have temporarily removed

Re: wget 1.9 - behaviour change in recursive downloads

2003-10-07 Thread Jochen Roderburg
Zitat von Hrvoje Niksic [EMAIL PROTECTED]: Jochen Roderburg [EMAIL PROTECTED] writes: Zitat von Hrvoje Niksic [EMAIL PROTECTED]: It's a feature. `-A zip' means `-A zip', not `-A zip,html'. Wget downloads the HTML files only because it absolutely has to, in order to recurse through

Re: wget 1.9 - behaviour change in recursive downloads

2003-10-03 Thread Hrvoje Niksic
It's a feature. `-A zip' means `-A zip', not `-A zip,html'. Wget downloads the HTML files only because it absolutely has to, in order to recurse through them. After it finds the links in them, it deletes them.
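The behaviour explained above can be sketched as a single invocation (`-r' and `-A' are real wget options; the URL is a placeholder, so treat this as an illustrative fragment):

```shell
# Recursive download keeping only .zip files. HTML pages are still fetched
# so wget can extract links from them, then deleted, as described above.
wget -r -A zip http://www.example.com/downloads/
```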

Re: wget 1.9 - behaviour change in recursive downloads

2003-10-03 Thread Jochen Roderburg
Zitat von Hrvoje Niksic [EMAIL PROTECTED]: It's a feature. `-A zip' means `-A zip', not `-A zip,html'. Wget downloads the HTML files only because it absolutely has to, in order to recurse through them. After it finds the links in them, it deletes them. Hmm, so it has really been an

Re: wget 1.9 - behaviour change in recursive downloads

2003-10-03 Thread Fred Holmes
At 12:05 PM 10/3/2003, Hrvoje Niksic wrote: It's a feature. `-A zip' means `-A zip', not `-A zip,html'. Wget downloads the HTML files only because it absolutely has to, in order to recurse through them. After it finds the links in them, it deletes them. How about a switch to keep the .html

Re: wget 1.9 - behaviour change in recursive downloads

2003-10-03 Thread Hrvoje Niksic
Jochen Roderburg [EMAIL PROTECTED] writes: Zitat von Hrvoje Niksic [EMAIL PROTECTED]: It's a feature. `-A zip' means `-A zip', not `-A zip,html'. Wget downloads the HTML files only because it absolutely has to, in order to recurse through them. After it finds the links in them, it deletes

Re: Wget 1.9-beta2 is available for testing

2003-10-01 Thread DervishD
Hi Hrvoje :) * Hrvoje Niksic [EMAIL PROTECTED] dixit: This beta includes several important bug fixes since 1.9-beta1, most notably the fix for correct file name quoting with recursive FTP downloads. That works, at least for me. I've tested with the ftp repository that previously

Re: wget bug

2003-09-26 Thread DervishD
Hi Jack :) * Jack Pavlovsky [EMAIL PROTECTED] dixit: It's probably a bug: bug: when downloading wget --mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is, but when downloading wget ftp://somehost.org/somepath/3*, wget saves the files as 3acv14%7Eanivcd.mpg

Re: wget bug

2003-09-26 Thread Hrvoje Niksic
Jack Pavlovsky [EMAIL PROTECTED] writes: It's probably a bug: bug: when downloading wget --mirror ftp://somehost.org/somepath/3acv14~anivcd.mpg, wget saves it as-is, but when downloading wget ftp://somehost.org/somepath/3*, wget saves the files as 3acv14%7Eanivcd.mpg Thanks for the report.
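The `%7E' in the mangled filename above is just the percent-encoded tilde. A quick local check in plain POSIX shell (no wget involved) shows where that byte sequence comes from:

```shell
# The leading-quote form makes printf emit the character's numeric code
# (POSIX behaviour): '~' is 126 decimal, 7E hex, hence "%7E".
enc=$(printf '%%%02X' "'~")
echo "$enc"    # %7E
```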

RE: Wget 1.9-beta1 is available for testing

2003-09-25 Thread Herold Heiko
Hrvoje, please add this patch: --- wget-1.9-beta1\windows\Makefile.src Sat May 18 02:16:36 2002 +++ wget-1.9-beta1.wip\windows\Makefile.src Thu Sep 25 08:09:26 2003 @@ -63,15 +63,17 @@ RM = del -SRC = cmpt.c safe-ctype.c connect.c host.c http.c netrc.c ftp-basic.c ftp.c \ -

RE: Wget 1.9-beta1 is available for testing

2003-09-25 Thread Herold Heiko
'; [EMAIL PROTECTED] Subject: RE: Wget 1.9-beta1 is available for testing Hrvoje, please add this patch: --- wget-1.9-beta1\windows\Makefile.src Sat May 18 02:16:36 2002 +++ wget-1.9-beta1.wip\windows\Makefile.src Thu Sep 25 08:09:26 2003 @@ -63,15 +63,17 @@ RM = del -SRC

Re: Wget 1.9-beta1 is available for testing

2003-09-25 Thread Hrvoje Niksic
Could the person who sent me the patch for Windows compilers support please resend it? Amidst all the viruses, I accidentally deleted the message before I've had a chance to apply it. Sorry about the mistake.

Re: Wget 1.9-beta1 is available for testing

2003-09-24 Thread DervishD
Hi Hrvoje :) * Hrvoje Niksic [EMAIL PROTECTED] dixit: http://fly.srk.fer.hr/~hniksic/wget/wget-1.9-beta1.tar.gz I've got and tested it, and with NO wgetrc (it happens the same with my own wgetrc, but I tested clean just in case), the problem with the quoting still exists:

Re: Wget 1.9-beta1 is available for testing

2003-09-24 Thread Hrvoje Niksic
DervishD [EMAIL PROTECTED] writes: I've got and tested it, and with NO wgetrc (it happens the same with my own wgetrc, but I tested clean just in case), the problem with the quoting still exists: $wget -r -c -nH ftp://user:[EMAIL PROTECTED]/Music/Joe Hisaishi [...] --15:22:55--

Re: Wget 1.9-beta1 is available for testing

2003-09-24 Thread DervishD
Hi Hrvoje :) * Hrvoje Niksic [EMAIL PROTECTED] dixit: Thanks for the detailed bug report. Although it doesn't look that way, this problem is nothing but a simple oversight. OK, patch applied and working! Now it downloads all files correctly, quotes appropriately, and all works smoothly

Re: wget renaming URL/file downloaded, how to???

2003-09-18 Thread Hrvoje Niksic
Lucuk, Pete [EMAIL PROTECTED] writes: as we can see above, wget has raznoe.shtml.html as the main file, this is *not* what I want, I *always* want the main file to be name index.html. Wget doesn't really have the concept of a main file. As a workaround, you could simply `ln -s

RE: wget renaming URL/file downloaded, how to???

2003-09-18 Thread Lucuk, Pete
, September 18, 2003 12:37 PM To: Lucuk, Pete Cc: '[EMAIL PROTECTED]' Subject: Re: wget renaming URL/file downloaded, how to??? Lucuk, Pete [EMAIL PROTECTED] writes: as we can see above, wget has raznoe.shtml.html as the main file, this is *not* what I want, I *always* want the main file

Re: wget proxy support

2003-09-14 Thread Hrvoje Niksic
Nicolas, thanks for the patch; I'm about to apply it to Wget CVS.

Re: wget --spider issue

2003-09-10 Thread Aaron S. Hawley
On Wed, 10 Sep 2003, Andreas Belitz wrote: Hi, i have found a problem regarding wget --spider. It works great for any files over http or ftp, but as soon as one of these two conditions occur, wget starts downloading the file: 1. linked files (i'm not 100% sure about this) 2.

Re: wget --spider issue

2003-09-10 Thread Andreas Belitz
Hi Aaron S. Hawley, On Wed, 10. September 2003 you wrote: ASH actually, what you call download scripts are actually HTTP redirects, and ASH in this case the redirect is to an FTP server and if you double-check i ASH think you'll find Wget does not know how to spider in ftp. end ASH

Re: wget --spider issue

2003-09-10 Thread Aaron S. Hawley
On Wed, 10 Sep 2003, Andreas Belitz wrote: Hi Aaron S. Hawley, On Wed, 10. September 2003 you wrote: ASH actually, what you call download scripts are actually HTTP redirects, and ASH in this case the redirect is to an FTP server and if you double-check i ASH think you'll find Wget does

Re: wget and MM_openBrWindow

2003-09-10 Thread Adam Stein
Wget doesn't interpret Javascript, only regular HTML (AFAIK). Wget won't be able to follow any links that are only setup thru a Javascript function like MM_openBrWindow(). Adam -- Adam Stein @ Xerox Corporation Email: [EMAIL PROTECTED]

RE: wget -r -p -k -l 5 www.protcast.com doesn't pull some images though they are part of the HREF

2003-09-09 Thread Post, Mark K
No, it won't. The javascript stuff makes sure of that. Mark Post -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] Sent: Tuesday, September 09, 2003 4:32 PM To: [EMAIL PROTECTED] Subject: wget -r -p -k -l 5 www.protcast.com doesnt pull some images though they are

Re: [SPAM?:###] RE: wget -r -p -k -l 5 www.protcast.com doesn't pull some images though they are part of the HREF

2003-09-09 Thread Aaron S. Hawley
I, on the other hand, am actually not sure why you're not able to have Wget find the marked up (not javascript) image. Cause it worked for me. % ls -l www.protcast.com/Grafx/menu-contact_\(off\).jpg -rw--- 1 ashawley usr 2377 Jan 10 2003

Re: wget 1.8.2 frustration

2003-08-31 Thread jeremy reeve
I believe this is on the TODO list, if I understood the problem correctly. I'm starting to look at the code and time permitting I'll try and address this one too. Cheers, Jeremy On Sat, 30 Aug 2003, Jason Mancini wrote: I'm trying to retrieve a list of files from an ftp server: wget

Re: wget and download.com

2003-08-27 Thread Ryan Underwood
Hello, On Wed, Aug 27, 2003 at 06:19:17AM -0400, joe j wrote: Hello, I am trying to download files with wget from download.com. I am using a windows system. For some reason wget doesn't know how to deal with download.com links. For example Kazaa link

RE: wget and 2 users / passwords to get through?

2003-08-20 Thread Post, Mark K
If this is a non-transparent proxy, you do indeed need to use the proxy parameters: --proxy-user=user --proxy-passwd=password As well as set the proxy server environment variables ftp_proxy=http://proxy.server.name[:port] - Note the http:// value. That is correct.
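A sketch of the setup described above; the proxy host and port are placeholders, and the point about the http:// scheme on `ftp_proxy' comes straight from the message:

```shell
# ftp_proxy must carry an http:// URL when the proxy itself speaks HTTP,
# as the message notes. Host and port below are placeholders.
export ftp_proxy=http://proxy.example.com:8080
echo "$ftp_proxy"

# Then authenticate to the proxy (flags as named in the message):
#   wget --proxy-user=user --proxy-passwd=password ftp://ftp.example.org/file
```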

RE: wget is mirroring whole internet instead of just my web page!

2003-08-18 Thread Post, Mark K
man wget shows: -D domain-list --domains=domain-list Set domains to be followed. domain-list is a comma-separated list of domains. Note that it does not turn on -H. Mark Post -Original Message- From: Andrzej Kasperowicz [mailto:[EMAIL PROTECTED]
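Since, as quoted above, `-D' does not turn on `-H', the two must be combined explicitly when you actually want to span hosts. An illustrative fragment with placeholder domains:

```shell
# Restrict a recursive crawl to two domains: -H enables host spanning,
# -D filters which spanned hosts are followed.
wget -r -H -D example.com,cdn.example.com http://www.example.com/
```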

RE: wget is mirroring whole internet instead of just my web page!

2003-08-18 Thread Andrzej Kasperowicz
On 18 Aug 2003 at 13:49, Post, Mark K wrote: man wget shows: -D domain-list --domains=domain-list Set domains to be followed. domain-list is a comma-separated list of domains. Note that it does not turn on -H. Right, but by default wget should not

RE: wget is mirroring whole internet instead of just my web page!

2003-08-18 Thread Post, Mark K
PROTECTED] Subject: RE: wget is mirroring whole internet instead of just my web page! On 18 Aug 2003 at 13:49, Post, Mark K wrote: man wget shows: -D domain-list --domains=domain-list Set domains to be followed. domain-list is a comma-separated list of domains

RE: Wget 1.8.2 timestamping bug

2003-08-06 Thread Post, Mark K
Angelo, It works for me: # wget -N http://www.nic.it/index.html --13:04:39-- http://www.nic.it/index.html => `index.html' Resolving www.nic.it... done. Connecting to www.nic.it[193.205.245.10]:80... connected. HTTP request sent, awaiting response... 200 OK Length: 2,474 [text/html]

RE: wget and procmail

2003-07-29 Thread Post, Mark K
Does the PATH of procmail contain the directory where wget lives? Mark Post -Original Message- From: Michel Lombart [mailto:[EMAIL PROTECTED] Sent: Tuesday, July 29, 2003 6:51 PM To: [EMAIL PROTECTED] Subject: wget and procmail Hello, I've an issue with wget and procmail. I install

Re: wget proxy support

2003-07-18 Thread Nicolas Schodet
* Thomas Schweikle [EMAIL PROTECTED] [030718 12:27]: in the manual page to wget you mention: --proxy=on/off Turn proxy support on or off. The proxy is on by default if the appropriate environmental variable is defined. Could you please state additionally what ... appropriate

RE: wget: ftp through http proxy not working with 1.8.2. It does work with 1.5.3

2003-07-14 Thread Hans Deragon (QA/LMC)
Hi again. Some people have reported experiencing the same problem, but nobody from the development team has forwarded a comment on this. Can anybody tell us if this is a bug or some config issue? Regards, Hans Deragon -Original Message- From: Hans Deragon (LMC) Sent: Wednesday,

RE: wget: ftp through http proxy not working with 1.8.2. It does work with 1.5.3

2003-07-14 Thread Aaron S. Hawley
Wget maintainer: http://www.geocrawler.com/archives/3/409/2003/3/0/10399285/ -- The geocrawler archives for Wget are alive again! On Mon, 14 Jul 2003, Hans Deragon (QA/LMC) wrote: Hi again. Some people have reported experiencing the same problem, but nobody from the development team has

RE: wget: ftp through http proxy not working with 1.8.2. It does work with 1.5.3

2003-07-14 Thread Post, Mark K
, 2003 10:21 AM To: '[EMAIL PROTECTED]' Subject: RE: wget: ftp through http proxy not working with 1.8.2. It does work with 1.5.3 Hi again. Some people have reported experiencing the same problem, but nobody from the development team has forwarded a comment on this. Anybody can tell us

RE: wget: ftp through http proxy not working with 1.8.2. It does work with 1.5.3

2003-07-14 Thread Post, Mark K
still interested in working on wget). Mark -Original Message- From: Hans Deragon (QA/LMC) [mailto:[EMAIL PROTECTED] Sent: Monday, July 14, 2003 1:05 PM To: 'Post, Mark K' Subject: RE: wget: ftp through http proxy not working with 1.8.2. It does work with 1.5.3 wget --debug -m ftp

Re: wget with openssl problems

2003-07-03 Thread Toby Corkindale
On Tue, Jun 24, 2003 at 02:41:50PM -0400, Jim Ennis wrote: Hello, I am trying to compile wget-1.8.2 on Solaris 9 with openssl-0.9.7b . The Don't.. Wget is seriously broken with the SSL extensions, see my messages a month or two ago. (Not that anyone replied :P) Check out curl perhaps?

Re: wget problem

2003-07-03 Thread Tony Lewis
Rajesh wrote: Wget is not mirroring the web site properly. For eg it is not copying symbolic links from the main web server.The target directories do exist on the mirror server. wget can only mirror what can be seen from the web. Symbolic links will be treated as hard references (assuming

Re: wget problem

2003-07-03 Thread Tony Lewis
Rajesh wrote: Thanks for your reply. I have tried using the command wget --user-agent=Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1), but it didn't work. Adding the user agent helps some people -- I think most often with web servers from the evil empire. I have one more question. In

RE: wget feature requests

2003-06-17 Thread Peschko, Edward
rrgh. I see that you have an option '--spider' to just test the connection, however, it doesn't seem to work unless you place the option *before* the url. i.e.: wget --spider 'ftp://stuff' works whereas wget --spider 'ftp://stuff2' doesn't. no - scratch that - it *doesn't* seem to work if

RE: wget feature requests

2003-06-17 Thread Peschko, Edward
Just upgraded to 1.8.2 and ok, I think I see the problem here... --spider only works with html files.. right? If so, why? Ed

RE: wget feature requests

2003-06-17 Thread Aaron S. Hawley
i submitted a patch in february. http://www.mail-archive.com/wget%40sunsite.dk/msg04645.html http://www.geocrawler.com/archives/3/409/2003/2/100/10313375/ On Tue, 17 Jun 2003, Peschko, Edward wrote: Just upgraded to 1.8.2 and ok, I think I see the problem here... --spider only works with

Re: wget recursion options ?

2003-06-12 Thread Aaron S. Hawley
there doesn't seem to be anything wrong with the page. are you having trouble with recursive wgets with other (all) pages, or just this one? /a

Re: wget recursion options ?

2003-06-12 Thread Peter Skye
Aaron S. Hawley wrote: there doesn't seem to be anything wrong with the page. are you having trouble with recursive wgets with other (all) pages or just this one. Aaron, good thinking -- I tried different urls and then different versions. It's a bug with the OS/2 wget 1.8.2 which I have.

Re: WGET help needed

2003-06-11 Thread Aaron S. Hawley
http://www.gnu.org/manual/wget/ On Wed, 11 Jun 2003, Support, DemoG wrote: hello, I need help on this subject: Please tell me what is the command line if i wanted to get all the files, subdirectories with all contained from a ftp like ftp.mine.com also i have the user and pass, and i will

Re: wget vs mms://*.wmv?

2003-06-04 Thread Aaron S. Hawley
another proprietary protocol brought to you by the folks in redmond, washington. http://sdp.ppona.com/ http://geocities.com/majormms/ On Sun, 1 Jun 2003, Andrzej Kasperowicz wrote: How could I download using wget that: mms://mms.itvp.pl/bush_archiwum/bush.wmv If wget cannot manage it then

try 1.8.2 Re: wget fails to re-escape ampersand in URL

2003-05-29 Thread cerbo
Hi George, maybe try Wget version 1.8.2 if you can get hold of it. Using that version Wget seems to behave as expected (see below). Cheers, cerbo wget -d http://search.yahoo.com/search?p=d%26d+exhaust; DEBUG output created by Wget 1.8.2 on irix6.5. --12:37:14--

Re: Wget a Post Form

2003-03-18 Thread Aaron S. Hawley
my guess is that this probably isn't in the manual. % wget --version GNU Wget 1.9-beta Copyright (C) 1995, 1996, 1997, 1998, 2000, 2001 Free Software Foundation, Inc. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of

Re: wget --spider doesn't work for ftp URLs

2003-03-06 Thread Aaron S. Hawley
a patch was submitted: http://www.mail-archive.com/wget%40sunsite.dk/msg04645.html On Thu, 6 Mar 2003, Keith Thompson wrote: When invoked with an ftp:// URL, wget's --spider option is silently ignored and the file is downloaded. This applies to wget version 1.8.2. To demonstrate:

Re: wget 301 redirects broken in 1.8.2

2003-02-24 Thread Aaron S. Hawley
this bug is confirmed in CVS, it looks like there's been a lot of changes to html-url.c /a On Thu, 20 Feb 2003, Jamie Zawinski wrote: Try this, watch it lose: wget --debug -e robots=off -nv -m -nH -np \ http://www.dnalounge.com/flyers/ http://www.dnalounge.com/flyers/ does a

Re: wget with Router

2003-02-16 Thread Kalin KOZHUHAROV
Dieter Kuntz wrote: I want to get the status-page of my router. the login of the router consists only of a password, no username. that is the problem, I cannot login with wget. can somebody help me? What protocol are you using? HTTP, FTP, Telnet? How do you usually do that, not using wget?

Re: wget with Router

2003-02-16 Thread Kalin KOZHUHAROV
Dieter Kuntz wrote: hello, thanks for your answer. I am using Win98. When I go to my status-page of my smc-router 7004abr, I use a browser and write http://192.168.2.1:88. You see, I use http. Then I get the startpage with a field, where I can fill in the password and enter. then I get the

Re: wget with Router

2003-02-16 Thread Max Bowsher
Kalin KOZHUHAROV wrote: Dieter Kuntz wrote: i will test with --http-user.. OK, I think you will not make it this way. What we are talking about here is form submission, not just a password. Your password happens to be part of a form. So first look at the html source of the page where

RE: wget 1.8x does not mirror FTP servers using an HTTP proxy

2003-02-03 Thread Micah Hoffman
Sorry for the length of this post. Some of you offered some helpful things for me to try. First, here's my ~/.wgetrc file: $ more ~/.wgetrc continue = on dirstruct = off dot_style = mega glob = on mirror = on passive_ftp = on progress = bar tries = 3 verbose = on use_proxy = on ftp_proxy =

Re: WGet -n -nc options

2003-01-22 Thread Fred Holmes
You need -N (upper case). The switch is case sensitive. Glad to see that someone else is an f-prot user. At 12:23 AM 1/22/2003, Steve Bratsberg wrote: I am using Wget to update the date file for f-prot disks that boot from Dos. I have a copy of the zipped dat file on the hard drive and I have

Re: WGET Syntax for a URL with cgi scripts.

2003-01-08 Thread Hack Kampbjorn
Fred Holmes wrote: Many thanks for the help. I'm using Windows 32 binary 1.8.2 on WIN2K. The syntax you have given me gets the first page OK, apparently by using the -O switch to specify a simple filename as the output file, rather than the automatically generated output filename(s). I.e., I

Re: wget -m imply -np?

2002-12-30 Thread Jens Rösner
Hi Karl! From my POV, the current set-up is the best solution. Of course, I am also no developer, but an avid user. Sometimes you just don't know the structure of the website in advance, so using -m as a trouble-free no-brainer will get you the complete site neatly done with timestamps. BTW,

Re: wget: wrong file type

2002-12-03 Thread Jeremy Hetzler
At 11:24 PM 12/2/2002 +0200, Taavi Meos wrote: Hi! This bug is not serious, everything works but may confuse a user. OS: RedHat Linux 8.0 GNU Wget 1.8.2 used filename: proftpd-1.2.6-1.i386.rpm When downloading a RedHat RPM package wget shows its type as: [audio/x-pn-realaudio-plugin] using

Re: Wget with -O and -k

2002-10-23 Thread J. Grant
Isn't nytimes subscription only? you need to set the cookie etc i think JG Jennifer Freeman wrote: Hiya, using wget with -k and -O together wget cannot change the links in the document. wget -k -O nytimes.html http://www.nytimes.com/ref/membercenter/help/privacy.html produces the following

Re: wget: relative link to non-relative

2002-10-16 Thread Daniel Webb
I've been watching this list for a couple of days, and I think it's just people asking questions (me included) with no one around to answer them. On 16 Oct 2002, hao chen wrote: Hi, Is there a way to convert relative links to non-relative links with wget? I searched through the manual and

RE: wget: relative link to non-relative

2002-10-16 Thread Franki
is nearly empty (it seems) nearly all the linux distros I have tried use wget for all updating tasks or similar... quite strange really. rgds Frank -Original Message- From: Daniel Webb [mailto:[EMAIL PROTECTED]] Sent: Thursday, 17 October 2002 4:11 AM To: [EMAIL PROTECTED] Subject: Re

RE: wget: relative link to non-relative

2002-10-16 Thread Vernon, Clayton
. -Original Message- From: Franki [mailto:[EMAIL PROTECTED]] Sent: Wednesday, October 16, 2002 4:22 PM To: [EMAIL PROTECTED] Subject: RE: wget: relative link to non-relative yeah, I have been lurking here for over a week, was going to ask a question, but didn't see any answers, so I wasn't gonna

Re: wget tries to print the file prn.html

2002-09-24 Thread Andre Majorel
On 2002-09-20 08:15 +0200, Dominic Chambers wrote: I am using wget 1.8.2 on Win2K SP2, and wget froze on the fifth downloaded file 'prn.html' using the command line: wget -r -l0 -A htm,html,png,gif,jpg,jpeg --no-parent http://java.sun.com/products/jlf/at/book About twenty

Re: wget tries to print the file prn.html

2002-09-20 Thread Thomas Lussnig
I just started using wget today, and I am very impressed with it. Apart from the first job I tried, everything has worked perfectly for a number of jobs, even changing to relative URLs, which I thought was very impressive. Thanks for the good work. I am using wget 1.8.2 on Win2K SP2,

Re: wget tries to print the file prn.html

2002-09-20 Thread Max Bowsher
Thomas Lussnig wrote: this is a Windows-specific problem. Normally prn.html should be a valid filename. And as you can check, long filenames can contain :. No they can't. And, on NTFS, including a : in a filename causes the data to be written into an invisible named stream. But there is an

Re: wget

2002-09-18 Thread Max Bowsher
[EMAIL PROTECTED] wrote: We're using the wget app to run our scheduled tasks. Each time it runs, a copy of the file is created with a number added to the end of it. Is there a way to turn this off? We tried adding --quiet to the bat file but it still wrote the file. -nc or -N depending on

Re: wget denied access by site even with robots off and user-agentoptions used

2002-09-17 Thread Earl Mitchell
Dale Therio wrote: Hmm... the 'facts' as you stated in your post to this list are that the site owner does not want people to suck the content of his/her site using a download manager. And to prove this, they are trying to detect such activity and are preventing it. Those are the facts

Re: wget denied access by site even with robots off and user-agent options used

2002-09-13 Thread Dale Therio
It would seem that this is not your site then and so if the owner of the site doesn't want you to download his/her entire site why should you? Maybe they have banners or something that pays for their hosting costs and they feel preventing tools like wget from sucking their site is a way to

Re: wget and asp

2002-09-13 Thread Max Bowsher
Try the source I sent you. Dominique wrote: thank you Max, np Is it different than the one I CVS-ed yesterday? I mean, does it have changes in creating filenames? Please note, that I finally compiled it and could run it. No changes... I did run autoconf, so you could go straight to

Re: wget and asp

2002-09-13 Thread Dominique
No changes... I did run autoconf, so you could go straight to configure (as you have too new an autoconf version). it compiled just fine now Something has just occurred to me: by default, wget defaults to recursion restricted to 5 levels. Perhaps that is the problem? If so, an -l0 will fix it.

Re: wget and asp

2002-09-12 Thread Dominique
The problem is that with a ?x=y, where y contains slashes, wget passes them unchanged to the OS, causing directories to be created, but fails to adjust relative links to account for the fact that the page is in a deeper directory than it should be. The solution is to map / to _ or something.

Re: wget and asp

2002-09-12 Thread Max Bowsher
Max Bowsher wrote: The problem is that with a ?x=y, where y contains slashes, wget passes them unchanged to the OS, causing directories to be created, but fails to adjust relative links to account for the fact that the page is in a deeper directory than it should be. The solution is to map /

Re: wget and asp

2002-09-11 Thread Thomas Lussnig
To invoke html examples they use calls like (just the first example): http://www.w3schools.com/html/tryit.asp?filename=tryhtml_basic What filename did you expect for this ? - tryit.asp - tryit.asp?filename=tryhtml_basic - tryhtml_basic Wget saves a file and a directory with this very

Re: wget and asp

2002-09-11 Thread Dominique
What filename did you expect for this? - tryit.asp - tryit.asp?filename=tryhtml_basic - tryhtml_basic Once again: the location is: http://www.w3schools.com/html/tryit.asp?filename=tryhtml_basic It is a frame set which requires frames. One of them is a problem, because it has special

Re: wget hangs on ftp

2002-09-11 Thread Noel Koethe
On Mit, 11 Sep 2002, Dominique wrote: Hello, wget ftp://ftp.reed.edu/pub/src/html-helper-mode.tar.gz ==> PORT ... done. ==> RETR html-helper-mode.tar.gz ... maybe you have to use --passive-ftp? -- Noèl Köthe

Re: wget and asp

2002-09-11 Thread Max Bowsher
Dominique wrote: tryit_edit.asp?filename=tryhtml_basicreferer=http://www.w3schools.com/html/html _examples.asp and just this one is truncated. I think some regexp or pattern or explicit list of where_not_to_break_a_string characters would solve the problem. Or maybe it is already possible,

Re: wget and asp

2002-09-11 Thread Dominique
Is it something I can do myself or the code has to be changed? Domi I think that some URL encoding has not happened somewhere. Whether wget or the web server is at fault, I don't know, but the solution would be to URL encode the slashes. Max.

Re: wget and asp

2002-09-11 Thread Thomas Lussnig
tryit_edit.asp?filename=tryhtml_basicreferer=http://www.w3schools.com/html/html _examples.asp and just this one is truncated. I think some regexp or pattern or explicit list of where_not_to_break_a_string characters would solve the problem. Or maybe it is already possible, but I dont know

Re: WGET and the robots.txt file...

2002-09-11 Thread Max Bowsher
-e robots=off Jon W. Backstrom wrote: Dear Gnu Developers, We just ran into a situation where we had to spider a site of our own on an outsourced service because the company was going out of business. Because wget respects the robots.txt file, however, we could not get an archive made
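The terse answer above spells out to the following invocation (the site is a placeholder; `-e' passes a wgetrc command on the command line, which is how `robots=off' is applied per-run):

```shell
# robots=off makes wget ignore robots.txt; use it only on sites you are
# entitled to archive, as in the situation described in the message.
wget -r -e robots=off http://www.example.com/
```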

Re: wget and asp

2002-09-11 Thread Max Bowsher
Thomas Lussnig wrote: Why should there be URL encoding? / is a legal character in URLs and in the GET string. It's used for example for Path2Query translation. The main problem is that wget needs to translate a URL into a filesystem name. Yes, you are right, I wasn't thinking clearly.

Re: wget and asp

2002-09-10 Thread Max Bowsher
You don't give a whole lot of information. It's kind of impossible to help when you don't know what the problem is. Posting the URL of the problem site would be a good idea. Max. Dominique wrote: Is it possible at all? dominique Dominique wrote: Hi, I have a problem trying to wget a

Re: wget and asp

2002-09-10 Thread Dominique
Posting the URL of the problem site would be a good idea. well, I have quite a few. let's start with this: http://www.w3schools.com/html/default.asp or just anything from such a page. I hacked around for a while with no apparent success. thanks dominique Max. Dominique wrote:

Re: wget and asp

2002-09-10 Thread Thomas Lussnig
Dominique wrote: Posting the URL of the problem site would be a good idea. well, I have quite a few. let's start with this: http://www.w3schools.com/html/default.asp or just anything from such a page page. I hacked around for a while with no apparent success. Try this and it

Re: wget and asp

2002-09-10 Thread Dominique
Yes! It works!! I just missed the -U option thanks a lot! dominique Thomas Lussnig wrote: Try this and it works !!! wget -U Mozilla/5.0 (compatible; MSIE 6.0; Windows NT 5.1) http://www.w3schools.com/html/default.asp Problem is that these sites block wget Cu Thomas Lussnig
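The working command quoted above needs shell quoting around the user-agent string to be copy-pasteable; a sketch (the agent string and URL are the ones from the thread):

```shell
# -U overrides wget's User-Agent header; some sites block the default
# "Wget/..." agent. Quote the string so the shell passes it as one argument.
wget -U "Mozilla/5.0 (compatible; MSIE 6.0; Windows NT 5.1)" \
     http://www.w3schools.com/html/default.asp
```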

RE: WGET can't get MS Word HTML Links

2002-08-20 Thread Jenkins, Matthew
I just tested this on the UNIX release wget-1.8.2.tar.gz and still came across the same problem. WGET creates files such as "subdirectory\filename.html" in the main directory. It then tries to get the images from filename.html, but cannot, since it has not traveled down into subdirectory.

Re: wget does not terminate with Suse kernel 2.4.10

2002-08-12 Thread Erlend Aasland
Hi, On Sun, Aug 11, 2002 at 05:26:44PM +0200, Kai Anding wrote: Hello there, I am trying to use wget with Suse 7.3 and i cannot get it to run correctly. An example would be wget ftp://ftp.suse.com/pub/suse/i386/8.0/SuSEgo.ico which produces the output . and then the program

Re: wget -r with https

2002-08-01 Thread Erlend Aasland
This was already fixed in wget-1.8.2 Regards Erlend Aasland On Thu, Aug 01, 2002 at 01:20:43PM +1000, [EMAIL PROTECTED] wrote: There is a problem with wget recursively retrieving pages over https. With debug on you get the following. Not following non-HTTP schemes. A quick

Re: wget connects to a default 80 port

2002-07-29 Thread Koifman Maya
Hi and thank you for your prompt reply. However, my problem is that I do http://www.helios.de/cgi-bin/nph-trace.cgi?128.131.44.10 and what it does instead is http://www.helios.de:80/cgi-bin/nph-trace.cgi?128.131.44.10. On Mon, 29 Jul 2002, Erlend Aasland wrote: Try wget

Re: wget and meta name=robots content=noindex,nofollow

2002-07-06 Thread Hack Kampbjørn
Cédric Rosa wrote: Hello, Is it normal that wget saves web pages which contain meta name=robots content=noindex ? Or does wget consider that it is not a search engine and respect only the follow/nofollow rules? Or is it a bug? :) I don't think wget supports meta name=robots tags.

Re: wget and meta name=robots content=noindex,nofollow

2002-07-06 Thread Cédric Rosa
://www.robotstxt.org/wc/norobots-rfc.txt. [...] Bye, Cedric. - Original Message - From: Hack Kampbjørn [EMAIL PROTECTED] To: Cédric Rosa [EMAIL PROTECTED] Cc: [EMAIL PROTECTED] Sent: Saturday, July 06, 2002 8:21 PM Subject: Re: wget and meta name=robots content=noindex,nofollow Cédric Rosa wrote

RE: wget - saving cookies to file

2002-07-02 Thread Herold Heiko
IIRC if no cookie is set no file is created. Try with wget -d, check the debug output. Heiko -- -- PREVINET S.p.A.[EMAIL PROTECTED] -- Via Ferretto, 1ph x39-041-5907073 -- I-31021 Mogliano V.to (TV) fax x39-041-5907472 -- ITALY -Original Message- From: Holger

RE: wget tries to use illegal chars in filename

2002-07-01 Thread Herold Heiko
Use 1.8.2 or 1.9-beta. PREVINET S.p.A. [EMAIL PROTECTED] -- Via Ferretto, 1 ph +39-041-5907073 -- I-31021 Mogliano V.to (TV) fax +39-041-5907472 -- ITALY -Original Message- From: Carl S. in 't Veld [mailto:[EMAIL PROTECTED]] Sent: Sunday, June 30, 2002 1:01

Re: wget 1.8.2 - relatives URLs to absolute in stdout

2002-06-23 Thread vogue
Dear wget list, I'm really amazed by the -k option in wget, however, it doesn't appear to work with the -O - option or even the -O file option! It would be really great if I could print the converted URLs to stdout. Does anyone have a workaround? Or maybe another piece of

Re: Wget 1.8.2-pre3 ready for testing

2002-06-02 Thread Doug Kaufman
On Tue, 28 May 2002, Hrvoje Niksic wrote: Doug Kaufman [EMAIL PROTECTED] writes: This doesn't work out of the box for DJGPP or for Cygwin. Appended is a patch to fix most of the problems. Thanks for the comments and the patch. The patch will most likely not make into the 1.8.2
