Re: for the wishlist

2001-03-06 Thread Jan Prikryl
Quoting Dan Harkless ([EMAIL PROTECTED]): the file's size). This feature would enable the writing of cool scripts to do something like multi-threaded retrieval at file level. [...] Hi, Alec. You're the second person within a few days to ask for such a feature. I've added it to the
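
The kind of script this feature would enable might look something like the following (a rough sketch, not anything wget ships; the urls.txt file and the one-background-wget-per-file approach are assumptions for illustration):

    #!/bin/sh
    # Sketch: fetch every URL listed in urls.txt in its own background
    # wget process, approximating multi-threaded retrieval at file level.
    while read url; do
        wget -q "$url" &    # one wget per file, running in parallel
    done < urls.txt
    wait                    # block until all background fetches finish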

Re: Wget and i18n

2001-03-06 Thread Hrvoje Niksic
Philipp Thomas [EMAIL PROTECTED] writes: * Hrvoje Niksic ([EMAIL PROTECTED]) [20010305 19:30]: If you leave LC_CTYPE at the default "C" locale, gettext converts eight-bit characters to question marks. What should it do? Characters > 127 are undefined in LC_CTYPE for the "C" locale. So
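
For context, the behaviour under discussion is driven by the locale environment at run time. A sketch (the de_DE locale name is only an example and must be installed on the system):

    # LC_MESSAGES selects a translated catalog, but with LC_CTYPE left at
    # the default "C" locale, eight-bit characters in the translated
    # messages come out as question marks:
    LC_MESSAGES=de_DE wget http://example.com/

    # Setting LC_CTYPE to a locale that defines characters above 127
    # lets the translated text through unmangled:
    LC_MESSAGES=de_DE LC_CTYPE=de_DE wget http://example.com/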

Re: Wget and i18n

2001-03-06 Thread Hrvoje Niksic
Philipp Thomas [EMAIL PROTECTED] writes: Oops, yes, my fingers were a bit too fast :-) Here they are, both safe-ctype.h and safe-ctype.c. They look good to me. The only thing I don't get is this check:

    #ifdef isalpha
    #error "safe-ctype.h and ctype.h may not be used simultaneously"
    #else

Re: Wget and i18n

2001-03-06 Thread Philipp Thomas
* Hrvoje Niksic ([EMAIL PROTECTED]) [20010306 10:35]:

    #ifdef isalpha
    #error "safe-ctype.h and ctype.h may not be used simultaneously"
    #else

Is the error statement actually true, or is this only a warning that tries to enforce consistency of the application? The error
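
The guard can be exercised from the shell (a hypothetical demonstration; it assumes safe-ctype.h sits in the current directory and that the system's ctype.h implements isalpha as a macro, which is exactly what the #ifdef tests):

    # A translation unit that includes both headers should trip the #error:
    printf '#include <ctype.h>\n#include "safe-ctype.h"\n' > both.c
    cc -c both.c   # expected: the "may not be used simultaneously" error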

Re: Wget

2001-03-06 Thread csaba . raduly
I'm confused. I thought 1.5.3 *did* display the dots, but I could be wrong. Please send queries like this to the list ([EMAIL PROTECTED]), not to me personally. -- Csaba Ráduly, Software Engineer Sophos Anti-Virus email: [EMAIL PROTECTED]

Re: Wget

2001-03-06 Thread John Poltorak
On Tue, Mar 06, 2001 at 11:28:04AM +, [EMAIL PROTECTED] wrote: I'm confused. I thought 1.5.3 *did* display the dots, but I could be wrong. It does here:

    1600K -> .......... .......... .......... .......... .......... [ 95%]
    1650K -> .......... .......... .......... ..........

Re: Wget and i18n

2001-03-06 Thread Philipp Thomas
* Hrvoje Niksic ([EMAIL PROTECTED]) [20010306 11:21]: It is true that old systems use Gcc, but I wonder if anyone tests *new* Gcc's on these old systems... Yes, they do. The patches to make gcc build on the original BSD are only present in the current CVS GCC. Philipp -- Penguins shall

Re: Wget and i18n

2001-03-06 Thread Hrvoje Niksic
Philipp Thomas [EMAIL PROTECTED] writes: * Hrvoje Niksic ([EMAIL PROTECTED]) [20010306 11:21]: It is true that old systems use Gcc, but I wonder if anyone tests *new* Gcc's on these old systems... Yes, they do. The patches to make gcc build on the original BSD are only present

css js

2001-03-06 Thread Irmund Thum
hi, I know my question was answered on this list some months ago, but I couldn't find it anymore. What is the current status of recursively downloading css and js files? TIA, i.t -- http://it97.dyn.dhs.org -- Irmund Thum

fancy logs

2001-03-06 Thread Vladi Belperchinov-Shabanski
hi! This is a (crazy) idea, but it could be useful (more or less): make wget append lines like: start-time end-time size status url to a `central' log after downloading each file... `status' can be used to determine whether it was ok, a timeout, a closed connection, etc. `central-log'
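
Pending such a feature, a small wrapper can approximate the proposed central log (a sketch only; the log location and field layout are assumptions taken from the proposal above, and wget's exit code does not distinguish timeouts from closed connections):

    #!/bin/sh
    # Sketch: wrap a single-file wget download and append one line of
    # "start-time end-time size status url" to a central log.
    LOG=$HOME/wget-central.log        # assumed location of the central log
    url=$1
    file=`basename "$url"`
    start=`date '+%Y-%m-%d.%H:%M:%S'`
    wget -q "$url"
    status=$?                         # 0 = ok, non-zero = some failure
    end=`date '+%Y-%m-%d.%H:%M:%S'`
    size=`ls -l "$file" 2>/dev/null | awk '{print $5}'`
    echo "$start $end $size $status $url" >> "$LOG"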

-c question

2001-03-06 Thread Vladi Belperchinov-Shabanski
hi! `wget -c file' starts to download the file from the beginning if the file has already been completely downloaded... why?! I expect wget to do nothing in this case: I wanted it to download the file to the end (i.e. to continue, -c), and if the file is already complete, there is nothing left to do.
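
Concretely, the reported behaviour is (commands illustrative; the URL is a placeholder):

    wget http://example.com/big.tar.gz      # interrupted partway through
    wget -c http://example.com/big.tar.gz   # -c resumes the partial file
    wget -c http://example.com/big.tar.gz   # file already complete; the
                                            # report is that this restarts
                                            # from scratch instead of
                                            # doing nothing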

Re: Wget and i18n

2001-03-06 Thread Philipp Thomas
* Hrvoje Niksic ([EMAIL PROTECTED]) [20010306 14:09]: OK, then the #error stays. If no one objects, I'll modify Wget to use these files. I have the patches ready and am about to test them. So if you wait a bit, you'll get patches ready to apply. Philipp -- Penguins shall save

retrieving images referenced from existing file

2001-03-06 Thread Sebastian Bossung
Hi, I mirrored a website to my computer using: wget -k -r -tinf -np URL. Later I noticed that some of the files were missing, so I decided to run: wget -k -r -tinf -nc -np URL, which in my opinion should do the job of looking at all pages and retrieving image files as needed. However, this only

Re: for the wishlist

2001-03-06 Thread Dan Harkless
Jan Prikryl [EMAIL PROTECTED] writes: Quoting Dan Harkless ([EMAIL PROTECTED]): the file's size). This feature would enable the writing of cool scripts to do something like multi-threaded retrieval at file level. [...] Hi, Alec. You're the second person within a few days to ask

Re: css js

2001-03-06 Thread Dan Harkless
Irmund Thum [EMAIL PROTECTED] writes: hi, I know my question was answered on this list some months ago, but I couldn't find it anymore. What is the current status of recursively downloading css and js files? It works in 1.6, which you can find in the usual mirrors of ftp://ftp.gnu.org/pub/gnu/wget/.
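
So with 1.6, a recursive fetch along these lines should pick up stylesheets and scripts as well (hedged: this rests on the claim above that 1.6 handles them during recursion; the URL is a placeholder):

    wget -r -np http://example.com/   # recursion in 1.6 reportedly follows
                                      # css and js references too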

Re: -c question

2001-03-06 Thread Dan Harkless
Vladi Belperchinov-Shabanski [EMAIL PROTECTED] writes: hi! `wget -c file' starts to download the file from the beginning if the file has already been completely downloaded... why?! I expect wget to do nothing in this case: I wanted it to download the file to the end (i.e. to

Re: retrieving images referenced from existing file

2001-03-06 Thread Dan Harkless
Sebastian Bossung [EMAIL PROTECTED] writes: Hi, I mirrored a website to my computer using: wget -k -r -tinf -np URL. Later I noticed that some of the files were missing, so I decided to run: wget -k -r -tinf -nc -np URL, which in my opinion should do the job of looking at all pages and

Re: retrieving images referenced from existing file

2001-03-06 Thread Dan Harkless
Sebastian Bossung [EMAIL PROTECTED] writes: Hi Dan, will the -p option in wget 1.6 look at each .html file that is already on my hard disk to see if anything needed for it is missing? 1.5.3 does not seem to do this (I am only talking about images here, no css or the like). It will, but
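
In command form, the approach under discussion would be something like the following (a sketch built only from the options mentioned in this thread; how -p interacts with files already on disk is exactly what is being asked here):

    wget -p -k -np URL   # -p fetches each page's requisites (e.g. inlined
                         # images); -k converts links for local viewing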

Re: retrieving images referenced from existing file

2001-03-06 Thread Sebastian Bossung
Hi again, I am now using wget 1.6 with -p. It seems to work, although I won't be able to tell for sure before tomorrow morning :-). I also noticed that the server is usually pretty fast but appears to "hang" from time to time (the web browser timing out). This might have been a problem on the first run,