Re: [Wget]: Bug submission

2001-12-29 Thread Hrvoje Niksic
[ Please mail bug reports to [EMAIL PROTECTED], not to me directly. ] Nuno Ponte [EMAIL PROTECTED] writes: I get a segmentation fault when invoking: wget -r http://java.sun.com/docs/books/performance/1st_edition/html/JPTOC.fm.html My Wget version is 1.7-3, the one which is

Re: Just a Question

2001-12-29 Thread Hrvoje Niksic
Edward Manukovsky [EMAIL PROTECTED] writes: Excuse me, please, but I've got a question. I cannot set the retry timeout to 30 seconds by doing: wget -w30 -T600 -c -b -t0 -S -alist.log -iurl_list For me, Wget waits for 30 seconds between each retrieval. What version are you using?
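A hedged sketch of what the poster may be after: `-w`/`--wait` pauses between *every* retrieval, while `--waitretry` applies only between retries of a failed download. Assuming the installed Wget supports `--waitretry` (it appears in the 1.7-era documentation), the command would look like:

```shell
# Assumption: the intent is a 30-second backoff only when a fetch fails.
# --waitretry caps the wait between retries of a failed download;
# -w/--wait (as in the original command) pauses between all retrievals.
wget --waitretry=30 -T600 -c -b -t0 -S -a list.log -i url_list
```

The remaining flags are unchanged from the original message; whether `--waitretry` matches the poster's intent depends on the rest of the (truncated) thread.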

Re: Bug if current folder doesn't exist

2001-12-29 Thread Hrvoje Niksic
Jean-Edouard BABIN [EMAIL PROTECTED] writes: I found a little bug when we download from a deleted directory: [...] Thanks for the report. I wouldn't consider it a real bug. Downloading things into a deleted directory is bound to produce all kinds of problems. The diagnostic message could

Re: recursive ftp via proxy problem in wget 1.8.1

2001-12-29 Thread Hrvoje Niksic
Jiang Wei [EMAIL PROTECTED] writes: I tried to download a whole directory on an FTP site using the `-r -np' options, going through a firewall via http_proxy/ftp_proxy. But I failed: wget-1.8.1 only retrieved the first indexed ftp file list and stopped working, while wget-1.5.3 can

Re: Assertion failure in wget 1.8, recur.c:753

2001-12-29 Thread Hrvoje Niksic
Thomas Reinke [EMAIL PROTECTED] writes: Neat... not sure that I really know enough to start digging to easily figure out what went wrong, but it can be reproduced by running the following: $ wget -d -r -l 5 -t 1 -T 30 -o x.lg -p -s -P dir -Q 500 --limit-rate=256000 -R mpg,mpeg

How to save what I see?

2001-12-29 Thread Robin B. Lake
I'm using wget to save a tick chart of a stock index each night. wget -nH -q -O /QoI/working/CHARTS/$myday+OEX.html 'http://bigcharts.marketwatch.com/quickchart/quickchart.asp?symb=%24OEX&sid=0&o_symb=%24OEX&x=60&y=15&freq=9&time=1' The Web site returns an image, whose HTML is: IMG
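Since the page returned is HTML wrapping an IMG tag rather than the image itself, a second fetch of the image URL is needed. A minimal sketch, using stand-in HTML because the real markup is truncated in the message above:

```shell
# Stand-in for the saved page; the real IMG markup is not shown in the message.
cat > chart.html <<'EOF'
<html><body><IMG SRC="http://bigcharts.marketwatch.com/charts/gqc.chart?fake=1"></body></html>
EOF

# Pull the SRC attribute out of the IMG tag, then the image could be
# fetched with a second wget call, e.g.: wget -O chart.gif "$img_url"
img_url=$(sed -n 's/.*IMG SRC="\([^"]*\)".*/\1/p' chart.html)
echo "$img_url"
```

This is illustrative only; the actual tag name, attribute quoting, and image URL on bigcharts may differ, so the sed pattern would need adjusting against the real saved page.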

SSL sites fail to be crawled

2001-12-29 Thread Thomas Reinke
It seems that SSL sites aren't crawled properly, because wget decides that the scheme is not to be followed. Offending code appears to be limited to only 3 lines located in recur.c: (version 1.8.1) Line 440: change to if (u->scheme != SCHEME_HTTP && u->scheme != SCHEME_HTTPS) Line 449:

SSL site mirroring

2001-12-29 Thread Thomas Reinke
Ok, either I've completely misread wget, or it has a problem mirroring SSL sites. It appears that it is deciding that the https:// scheme is something that is not to be followed. For those interested, the offending code appears to be 3 lines in recur.c which, if changed to treat the HTTPS scheme