Quoting Dan Harkless ([EMAIL PROTECTED]):
the file's size). This feature would enable the writing of cool scripts to
do something like multi-threaded retrieval at file level.
[...]
Hi, Alec. You're the second person within a few days to ask for such a
feature. I've added it to the
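The "multi-threaded retrieval at file level" idea above amounts to fetching separate byte ranges of one file in parallel and stitching them together. A minimal offline sketch (the filenames are made up, and `head`/`tail` stand in for real HTTP Range requests, which wget itself does not issue in parallel):

```shell
# Simulate the "remote" file locally so the sketch runs without a network.
printf '0123456789ABCDEF' > remote.bin
size=$(wc -c < remote.bin)
half=$((size / 2))

# Fetch the two halves concurrently (each & job plays one "thread").
head -c "$half" remote.bin            > part1 &
tail -c +$((half + 1)) remote.bin     > part2 &
wait

# Reassemble and verify.
cat part1 part2 > local.bin
cmp -s remote.bin local.bin && echo "parallel fetch complete"
```

A real script would replace the `head`/`tail` lines with ranged HTTP requests, which is why the feature request asks the server to report the file's size first.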
Philipp Thomas [EMAIL PROTECTED] writes:
* Hrvoje Niksic ([EMAIL PROTECTED]) [20010305 19:30]:
you leave LC_CTYPE at the default, "C" locale, gettext converts
eight-bit characters to question marks.
What should it do? Characters above 127 are undefined in LC_CTYPE for
the "C" locale. So
Philipp Thomas [EMAIL PROTECTED] writes:
Ooops, yes my fingers were a bit too fast :-) Here they are, both
safe-ctype.h and safe-ctype.c.
They look good to me. The only thing I don't get is this check:
#ifdef isalpha
#error "safe-ctype.h and ctype.h may not be used simultaneously"
#else
* Hrvoje Niksic ([EMAIL PROTECTED]) [20010306 10:35]:
#ifdef isalpha
#error "safe-ctype.h and ctype.h may not be used simultaneously"
#else
Is the error statement actually true, or is this only a warning that
tries to enforce consistency of the application?
The error
I'm confused. I thought 1.5.3 *did* display the dots, but I could be wrong.
Please send queries like this to the list ( [EMAIL PROTECTED] ), not to me
personally.
--
Csaba Ráduly, Software Engineer, Sophos Anti-Virus
email: [EMAIL PROTECTED]
On Tue, Mar 06, 2001 at 11:28:04AM +, [EMAIL PROTECTED] wrote:
I'm confused. I thought 1.5.3 *did* display the dots, but I could be wrong.
It does here:-
1600K - .. .. .. .. .. [ 95%]
1650K - .. .. .. ..
* Hrvoje Niksic ([EMAIL PROTECTED]) [20010306 11:21]:
It is true that old systems use Gcc, but I wonder if anyone tests
*new* Gcc's on these old systems...
Yes, they do. The patches to make gcc build on the original BSD are only
present in the current CVS GCC.
Philipp
--
Penguins shall
Philipp Thomas [EMAIL PROTECTED] writes:
* Hrvoje Niksic ([EMAIL PROTECTED]) [20010306 11:21]:
It is true that old systems use Gcc, but I wonder if anyone tests
*new* Gcc's on these old systems...
Yes, they do. The patches to make gcc build on the original BSD are
only present
hi
I know my question was answered on this list some months ago, but
couldn't find it anymore.
How is it now with recursive downloading of css and js files?
tia
i.t
-- http://it97.dyn.dhs.org --
IrmundThum
hi!
this is a (crazy) idea but it could be useful (more or less)
it is to make wget add lines like:
start-time end-time size status url
to a `central' log after downloading of each file...
`status' can be used to determine if it is ok, timeout
closed connection, etc. `central-log'
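The central-log idea above can be prototyped with a small wrapper. This is only a sketch under stated assumptions: `fetch_one` is a hypothetical stand-in for the real download step (it would call wget in practice; here it just writes a file so the sketch runs offline), and the log format follows the proposed `start-time end-time size status url` line:

```shell
LOG=central.log

fetch_one() {  # hypothetical: simulates a download; real code would run wget "$1"
    printf 'hello' > "downloaded.tmp"
}

log_download() {
    url=$1
    start=$(date +%s)
    if fetch_one "$url"; then status=OK; else status=FAIL; fi
    end=$(date +%s)
    size=$(( $(wc -c < "downloaded.tmp") ))
    # Append one line per file: start-time end-time size status url
    printf '%s %s %s %s %s\n' "$start" "$end" "$size" "$status" "$url" >> "$LOG"
}

log_download http://example.com/file
awk '{print $3, $4, $5}' "$LOG"   # prints: 5 OK http://example.com/file
```

A real wrapper would map wget's exit code (timeout, connection closed, etc.) onto richer status values, as the message suggests.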
hi!
`wget -c file'
starts to download file from the beginning if the file
is completely downloaded already...
why?!
I expect wget to do nothing in this case: I wanted it
to download file to the end (i.e. to continue, -c) and
if the file is already here so there is nothing to do.
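What the poster expects `-c` to do can be shown with a local simulation (no network; the filenames are made up). Conceptually, resuming means requesting only the bytes past the current local size and appending them; if the local size already equals the remote size, there is nothing to fetch:

```shell
# A "remote" file and a partially downloaded local copy:
printf 'ABCDEFGHIJ' > remote.bin
printf 'ABCDE'      > local.bin

# Resume = fetch only the bytes past the local size and append them
# (the moral equivalent of an HTTP Range request starting at the offset).
offset=$(wc -c < local.bin)
tail -c +$((offset + 1)) remote.bin >> local.bin

cmp -s remote.bin local.bin && echo "resume complete"
```

Run the resume step again and `offset` equals the remote size, so nothing is appended — which is the behavior the poster expected from `wget -c` on an already-complete file.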
* Hrvoje Niksic ([EMAIL PROTECTED]) [20010306 14:09]:
OK, then the #error stays. If no one objects, I'll modify Wget to use
these files.
I have the patches ready and am about to test them. So if you wait a
bit, you'll get patches ready to apply.
Philipp
--
Penguins shall save
Hi,
I mirrored a website to my computer using:
wget -k -r -tinf -np URL
Later I noticed that some of the files were missing. So I decided to run:
wget -k -r -tinf -nc -np URL
Which in my opinion should do the job of looking at all pages and retrieving
image files as needed. However, this only
Jan Prikryl [EMAIL PROTECTED] writes:
Quoting Dan Harkless ([EMAIL PROTECTED]):
the file's size). This feature would enable the writing of cool
scripts to do something like multi-threaded retrieval at file level.
[...]
Hi, Alec. You're the second person within a few days to ask
Irmund Thum [EMAIL PROTECTED] writes:
hi
I know my question was answered on this list some months ago, but
couldn't find it anymore.
How is it now with recursive downloading of css and js files?
It works in 1.6, which you can find in the usual mirrors of
ftp://ftp.gnu.org/pub/gnu/wget/.
Vladi Belperchinov-Shabanski [EMAIL PROTECTED] writes:
hi!
`wget -c file'
starts to download file from the beginning if the file
is completely downloaded already...
why?!
I expect wget to do nothing in this case: I wanted it
to download file to the end (i.e. to
Sebastian Bossung [EMAIL PROTECTED] writes:
Hi,
I mirrored a website to my computer using:
wget -k -r -tinf -np URL
Later I noticed that some of the files were missing. So I decided to run:
wget -k -r -tinf -nc -np URL
Which in my opinion should do the job of looking at all pages and
Sebastian Bossung [EMAIL PROTECTED] writes:
Hi Dan,
will the -p option in wget 1.6 look at each .html file that is already on
my hard disk to see if anything needed for it is missing? 1.5.3 does not
seem to do this (I am only talking about images here - no css or the like)
It will, but
Hi again,
I am now using wget 1.6 with -p. Seems to work, though I won't be able to tell
for sure before tomorrow morning :-). I also noticed that the server is
usually pretty fast, but appears to "hang" from time to time (web browser
timing out). This might have been a problem on the first run,