Re: --disable-dns-cache

2003-09-04 Thread Max Bowsher
Mauro Tortonesi wrote: On Tue, 2 Sep 2003, Hrvoje Niksic wrote: Mauro Tortonesi [EMAIL PROTECTED] writes: On Tue, 2 Sep 2003, Jeremy Reeve wrote: I've written a trivial patch to implement the --disable-dns-cache feature as described in the TODO contained in the CVS tree. I need to write

Re: how to save the file in memory

2003-05-31 Thread Max Bowsher
niurui wrote: Hi all, I want to embed wget in my program. First my program will use wget to get a file from the internet, then it will search this file for some words. So I want to save that file in memory rather than on the hard disk. Can wget save the file in memory? How? Wget is totally the
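
One way to avoid a file on disk, as a hedged sketch assuming a Unix-like shell (URL and search term are placeholders), is to have wget write the document to standard output with -O - and read it from a pipe:

  # Fetch quietly, write to stdout, and search the stream for a word
  wget -q -O - http://example.com/page.html | grep -i "some words"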

Re: Feature request: file exclude.

2003-03-23 Thread Max Bowsher
Sherwood Botsford wrote: I wish I had a file exclude option. I'm behind a firewall that doesn't allow ftp, so I have to find sites that use http for file transfer. I'm currently trying to install cygwin on my local LAN. To do that, I'm using wget to mirror the remote site locally,
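
A hedged sketch of the closest existing option: wget's -R/--reject takes a comma-separated list of filename suffixes or patterns to skip, which may cover this kind of file exclude. The host and patterns below are placeholders:

  # Recursive fetch, staying below the start directory, skipping some file types
  wget -r -np -R "*.iso,*.mp3" http://mirror.example.com/cygwin/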

Re: Not 100% RFC 1738 compliance for FTP URLs = bug

2003-03-13 Thread Max Bowsher
David Balazic wrote: As I got no response on [EMAIL PROTECTED], I am resending my report here. One forwards to the other. The problem is that the wget maintainer is absent, and likely to continue to be so for several more months. As a result, wget development is effectively stalled. Max.

Re: Not 100% RFC 1738 compliance for FTP URLs = bug

2003-03-13 Thread Max Bowsher
David Balazic wrote: Max Bowsher wrote: David Balazic wrote: As I got no response on [EMAIL PROTECTED], I am resending my report here. One forwards to the other. The problem is that the wget maintainer is absent, and likely to continue to be so for several more months. As a result

Re: wget with Router

2003-02-16 Thread Max Bowsher
Kalin KOZHUHAROV wrote: Dieter Kuntz wrote: i will test with --http-user.. OK, I think you will not make it this way. What we are talking about here is form submission, not just a password. Your password happens to be part of a form. So first look at the html source of the page where
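
A hedged sketch of the form-submission idea, assuming a wget build that supports --post-data (newer than the 1.8.x discussed here); the URL and field names are placeholders and, as the reply says, the real ones have to be read from the form's HTML source:

  # Submit the login form fields directly and keep any cookies the router sets
  wget --save-cookies cookies.txt \
       --post-data "username=admin&password=secret" \
       http://192.168.1.1/login.cgi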

Re: Removing files and directories not present on remote FTP server

2003-02-14 Thread Max Bowsher
Nick Earle wrote: Can someone let me know if and when this feature is going to be implemented within WGet. I am currently using WGet on a Windows platform and this is the only feature that appears to be missing from this utility. wget development seems to be at a standstill. AFAIK, there is

Re: Removing files and directories not present on remote FTP server

2003-02-14 Thread Max Bowsher
Nick Earle wrote: Can someone let me know if and when this feature is going to be implemented within WGet. If and when someone decides to implement it. But there is almost certainly not going to be another release until after Hrvoje Niksic has returned. Max.

Re: dev. of wget (was Re: Removing files and directories not present on remote FTP server

2003-02-14 Thread Max Bowsher
Aaron S. Hawley wrote: On Fri, 14 Feb 2003, Max Bowsher wrote: If and when someone decides to implement it. But there is almost certainly not going to be another release until after Hrvoje Niksic has returned. Can someone at FSF do something? [EMAIL PROTECTED], [EMAIL PROTECTED

Re: Source code download

2003-01-14 Thread Max Bowsher
Praveen wrote: Hi there, I am trying to download some asp file. It gives me the html output of the file. Is it possible to download the source code through wget? No. Max.

Re: bug or limitation of wget used to access VMS servers

2003-01-08 Thread Max Bowsher
Ken Senior [EMAIL PROTECTED] wrote: There does not seem to be support to change disks when accessing a VMS server via wget. Is this a bug or just a limitation? Wget does plain old HTTP and FTP. I know nothing about VMS. Does it have some strange syntax for discs?

Re: fetching asp file

2002-12-03 Thread Max Bowsher
deeps [EMAIL PROTECTED] wrote: I just want to see the source files for this java chat application, !?!?!?! Java is a compiled language! The source files probably aren't even on the webserver. Max.

Re: mirroring question

2002-11-01 Thread Max Bowsher
DennisBagley [EMAIL PROTECTED] wrote: ok - am using wget to mirror an ftp site [duh] and would like it not only to keep an up-to-date copy of the files [which it does beautifully] but also remove files that are no longer on the ftp server ?? Is this possible ??? Use a perl script. Max.

Re: meta crash bug

2002-10-19 Thread Max Bowsher
It's already fixed in CVS for 1.9. Max. Ivan A. Bolsunov [EMAIL PROTECTED] wrote: version: 1.8.1 in file: html-url.c in function: tag_handle_meta() { ... skipped ... char *p, *refresh = find_attr (tag, content, attrind); int timeout = 0; for (p = refresh; ISDIGIT

Re: Can't use wildcard

2002-10-19 Thread Max Bowsher
ROMNEY Jean-Francois [EMAIL PROTECTED] wrote: I can't download files with wget 1.8.1 by using wildcards. Without wildcards, it works. The option --glob=on seems to have no effect. The command is : wget -d --glob==on -nc ftp://--:---;ftpcontent.mediapps.com/francais/journal/eco/*.jpg

Re: Can't use wildcard

2002-10-18 Thread Max Bowsher
ROMNEY Jean-Francois [EMAIL PROTECTED] wrote: I can't download files with wget 1.8.1 by using wildcards. Without wildcards, it works. The option --glob=on seems to have no effect. The command is : wget -d --glob==on -nc ftp://--:---;ftpcontent.mediapps.com/francais/journal/eco/*.jpg

Re: Can't use wildcard

2002-10-17 Thread Max Bowsher
ROMNEY Jean-Francois [EMAIL PROTECTED] wrote: There is the same error with only one = What do you mean by Escape/quote? Read the documentation for your shell. wget never sees the * because the shell has already globbed it. Also, please keep discussions on-list. Max.
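
A minimal sketch of the quoting fix (host and credentials below are placeholders, not the redacted originals): quoting the URL stops the shell from expanding the * itself, so wget's own FTP globbing gets to handle it:

  # A single --glob=on (globbing is normally on anyway for FTP URLs with wildcards)
  wget -nc --glob=on "ftp://user:pass@ftp.example.com/francais/journal/eco/*.jpg"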

Re: Recursive download Problem.

2002-10-04 Thread Max Bowsher
Yun MO wrote: Dear Ma'am/sir, I could not get all files with "wget -r" command for following address. Would you help me? Thank you in advance. M.Y. --- meta NAME="robots" CONTENT="noindex,nofollow" Wget is obeying the robots instruction. wget -e robots=off ...

Re: wget tries to print the file prn.html

2002-09-20 Thread Max Bowsher
Thomas Lussnig wrote: this is a Windows-specific problem. Normally prn.html should be a valid filename. And as you can check, long filenames can contain ':'. No, they can't. And, on NTFS, including a ':' in a filename causes the data to be written into an invisible named stream. But there is an

Re: wget

2002-09-18 Thread Max Bowsher
[EMAIL PROTECTED] wrote: We're using the wget app to run our scheduled tasks. Each time it runs, a copy of the file is created with a number added to the end of it. Is there a way to turn this off? We tried adding --quiet to the bat file but it still wrote the file. -nc or -N depending on
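
A minimal sketch of the two options mentioned above (the URL is a placeholder):

  wget -nc http://example.com/report.html   # --no-clobber: skip if the file already exists
  wget -N http://example.com/report.html    # --timestamping: re-fetch only if the remote copy is newer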

Re: wget and asp

2002-09-13 Thread Max Bowsher
Try the source I sent you. Dominique wrote: thank you Max, np Is it different than the one I CVS-ed yesterday? I mean, does it have changes in creating filenames? Please note, that I finally compiled it and could run it. No changes... I did run autoconf, so you could go straight to

Re: wget and asp

2002-09-12 Thread Max Bowsher
Max Bowsher wrote: The problem is that with a ?x=y, where y contains slashes, wget passes them unchanged to the OS, causing directories to be created, but fails to adjust relative links to account for the fact that the page is in a deeper directory than it should be. The solution is to map

Suggestion: Anonymous rsync access to the wget CVS tree.

2002-09-12 Thread Max Bowsher
As a dial-up user, I find it extremely useful to have access to the full range of cvs functionality whilst offline. Some other projects provide read-only rsync access to the CVS repository, which allows a local copy of the repository to be made, not just a checkout of a particular version.
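
A hypothetical sketch of what that would look like for a user, assuming an rsync module were offered (host and module name are made up here):

  # Pull a local copy of the whole repository, then work against it offline
  rsync -av --delete rsync://cvs.example.org/wget-cvsroot/ ./wget-cvsroot/
  cvs -d "$PWD/wget-cvsroot" checkout wget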

Re: wget and asp

2002-09-11 Thread Max Bowsher
Dominique wrote: tryit_edit.asp?filename=tryhtml_basic&referer=http://www.w3schools.com/html/html_examples.asp and just this one is truncated. I think some regexp or pattern or explicit list of where_not_to_break_a_string characters would solve the problem. Or maybe it is already possible,

Re: WGET and the robots.txt file...

2002-09-11 Thread Max Bowsher
-e robots=off Jon W. Backstrom wrote: Dear Gnu Developers, We just ran into a situation where we had to spider a site of our own on an outsourced service because the company was going out of business. Because wget respects the robots.txt file, however, we could not get an archive made
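
A minimal sketch combining the override above with recursion (the URL is a placeholder):

  # Ignore robots.txt for this run and mirror recursively
  wget -r -e robots=off http://www.example.com/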

Re: wget and asp

2002-09-11 Thread Max Bowsher
Thomas Lussnig wrote: Why should there be any URL encoding? '/' is a legal character in a URL and in the GET string. It is used, for example, for path-to-query translation. The main problem is that wget needs to translate a URL into a filesystem name. Yes, you are right, I wasn't thinking clearly.

Re: wget and asp

2002-09-10 Thread Max Bowsher
You don't give a whole lot of information. It's kind of impossible to help when you don't know what the problem is. Posting the URL of the problem site would be a good idea. Max. Dominique wrote: Is it possible at all? dominique Dominique wrote: Hi, I have a problem trying to wget a

Re: getting the correct links

2002-08-29 Thread Max Bowsher
Christopher Stone wrote: Thank you all. Now the issue seems to be that it only gets the root directory. I ran 'wget -km -nd http://www.mywebsite.com -r Max.

Re: getting the correct links

2002-08-29 Thread Max Bowsher
Jens Rösner wrote: Hi! Max's hint is incorrect, I think, as -m includes -N (timestamps) and -r (recursive). Oops, you're right. I tend not to use -m much myself. I should pay more attention! Max.
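
A minimal sketch of the corrected invocation (the site is the placeholder from the thread): -m already implies -r and -N, and -k rewrites the links so the local copy can be browsed without going back to the remote server:

  wget -m -k http://www.mywebsite.com/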

Re: getting the correct links

2002-08-28 Thread Max Bowsher
Christopher Stone wrote: When I ran wget and sucked the site to my local box, it pulled all the pages down and the index page comes up fine, but when I click on a link, it goes back to the remote server. What switch(es) do I use, so that when I pull the pages to my box, all of the

Re: ? about Servlets

2002-08-26 Thread Max Bowsher
Vernon, Clayton wrote: I'm having all kinds of difficulties downloading from servlets. Here is a script for a URL that works in a browser to download a ZIP file but doesn't work via Wget http://oasis.caiso.com/servlet/SingleZip?nresultformat=5squeryname=SLD_LOAD

Re: not downloading everything with --mirror

2002-08-15 Thread Max Bowsher
HTTP does not provide a dirlist command, so wget parses html to find other files it should download. Note: HTML not XML. I suspect that is the problem. Max. Funk Gabor wrote: I recently found that during a (wget) mirror, not all the files are downloaded. (wget v1.8.2 / debian) For example:

Wget Bug: Re: not downloading everything with --mirror

2002-08-15 Thread Max Bowsher
Funk Gabor wrote: HTTP does not provide a dirlist command, so wget parses html to find other files it should download. Note: HTML not XML. I suspect that is the problem. If wget wouldn't download the rest, I'd say that too. But first the dir gets created, the xml is downloaded (in some other

Suggestion: Anonymous rsync access to the CVS tree.

2002-08-15 Thread Max Bowsher
As a dial-up user, I find it extremely useful to have access to the full range of cvs functionality whilst offline. Some other projects provide read-only rsync access to the CVS repository, which allows a local copy of the repository to be made, not just a checkout of a particular version. Since

Re: [BUG] assert test msecs

2002-08-01 Thread Max Bowsher
Hartwig, Thomas wrote: I got an assert exit from wget in retr.c, in the function calc_rate, because msecs is 0 or less than 0 (in rare cases). I don't know why; perhaps because I have a big line to the server or the wrong OS. To work around this I patched retr.c, setting msecs = 1 if equal