Re: Bug in ETA code on x64

2006-03-31 Thread Greg Hurrell

On 29/03/2006, at 14:39, Hrvoje Niksic wrote:



I can't see any good reason to use "," here. Why not write the line
as:
  eta_hrs = eta / 3600; eta %= 3600;


Because that's not equivalent.


Well, it should be, because the comma operator has lower precedence
than the assignment operator (see http://tinyurl.com/evo5a,
http://tinyurl.com/ff4pp and numerous other locations).


Indeed you are right. So:

eta_hrs = eta / 3600, eta %= 3600;

Is equivalent to the following (with explicit parentheses to make the
effect of the precedence obvious):


(eta_hrs = eta / 3600), (eta %= 3600);

Or of course:

eta_hrs = eta / 3600; eta %= 3600;

Greg





Re: Bug in ETA code on x64

2006-03-29 Thread Greg Hurrell

On 28/03/2006, at 20:43, Tony Lewis wrote:


Hrvoje Niksic wrote:


The cast to int looks like someone was trying to remove a warning and
botched operator precedence in the process.


I can't see any good reason to use "," here. Why not write the line  
as:

  eta_hrs = eta / 3600; eta %= 3600;


Because that's not equivalent. "The sequence or comma operator , has  
two operands: first the left operand is evaluated, then the right.  
The result has the type and value of the right operand. Note that a  
comma in a list of initializations or arguments is not an operator,  
but simply a punctuation mark!".


Cheers,
Greg






Re: New version of LFS patch

2005-02-20 Thread Greg Hurrell
On 20/02/2005, at 16:52, Hrvoje Niksic wrote:
Greg Hurrell <[EMAIL PROTECTED]> writes:
Is this a patch against the current CVS version, or against 1.9.1?
It's against the current CVS version, sorry for having forgotten to
point it out.  Perhaps you've forgotten to run cvs update?
No. Must be some other weirdness on my system. This is on a freshly 
checked-out copy:

$ cvs -d :pserver:[EMAIL PROTECTED]:/pack/anoncvs login
(Logging in to [EMAIL PROTECTED])
CVS password:
$ cvs -d :pserver:[EMAIL PROTECTED]:/pack/anoncvs co wget
cvs server: Updating wget
U wget/AUTHORS
U wget/COPYING
[lots of CVS output]
U wget/windows/config.h.ms
U wget/windows/wget.dep
$ cd wget
$ patch -p0 < ../LFS-patch
patching file configure.in
patching file src/ftp-basic.c
patching file src/ftp-ls.c
Hunk #1 FAILED at 213.
Hunk #2 FAILED at 368.
Hunk #3 FAILED at 517.
Hunk #4 FAILED at 527.
[lots more of the same]
No idea what's wrong here... Perhaps it's just not my destiny to have 
the patch apply! ;-)

Cheers,
Greg


Re: New version of LFS patch

2005-02-20 Thread Greg Hurrell
On 20/02/2005, at 15:02, Hrvoje Niksic wrote:
Here is the new version of the patch, with print_*_number_to_string
replaced with a single number_to_static_string, which does the buffer
ring stunt we discussed (and has a more accurate name to boot).
Please try it out and let me know if it works for you.
Is this a patch against the current CVS version, or against 1.9.1? I 
get a whole stack of hunk FAILED warnings when I try to apply the patch 
to either one (just using "patch -p0 < LFS-patch"):

patching file configure.in
patching file src/ftp-basic.c
patching file src/ftp-ls.c
Hunk #1 FAILED at 213.
Hunk #2 FAILED at 368.
Hunk #3 FAILED at 517.
Hunk #4 FAILED at 527.
4 out of 4 hunks FAILED -- saving rejects to file src/ftp-ls.c.rej
patching file src/ftp.c
Hunk #4 FAILED at 770.
Hunk #6 FAILED at 1263.
Hunk #8 succeeded at 1482 with fuzz 1.
Hunk #9 FAILED at 1844.
3 out of 9 hunks FAILED -- saving rejects to file src/ftp.c.rej
Greg


Re: Finding information on wget

2005-01-11 Thread Greg Hurrell
On 11/01/2005, at 8:31, Tony Lewis wrote:
I would argue that instead of the table of contents there should be a
navigation sidebar with a structure something like the following:
Great ideas, Greg. I'm sure if someone (you, perhaps? ... hint, hint)
created the HTML page(s) for what you've suggested, Mauro would figure
out a way to get them posted.
I would be happy to do that after hearing comments from others on the list.
What I'm proposing is a conservative change. Basically, I would use CSS 
to do the layout and transform the table of contents into the 
navigation sidebar. Those with CSS-capable browsers would see the 
sidebar, and those without would see the site as it looks today (with 
the table of contents at the top).

Greg


Finding information on wget

2005-01-10 Thread Greg Hurrell
On 10/01/2005, at 20:43, Mauro Tortonesi wrote:
(Yes, there are no doubt some web-based archives of the mailing list, 
such
as , but there's no 
single
official archive linked to from the official wget page or in the wget
documentation.)
well, the official wget page:
http://www.gnu.org/software/wget
reports the following URLs:
http://fly.cc.fer.hr/archive/wget
http://www.mail-archive.com/wget%40sunsite.dk/
http://www.geocrawler.com/archives/3/409/
as mailing list archives. perhaps you're suggesting we should choose 
one of
them as the "official" one?
I guess the problem is that none of them are official, are they? And 
there's no guarantee that they'll continue to be provided. The 
information is out there, but it can be hard to find. In fact, I think 
information about wget is generally hard to find. Consider that there's 
the official wget web page, then the page at  
which could easily be mistaken for the official page too. The mailing 
list archives are on three unrelated domains. The new bug tracker is at 
 but I only found that using 
Google. The older, unofficial bug tracker at 
 might still turn up for some people 
searching at Google, too. That's a lot of domains.

this is my fault. i tested the code under linux and it worked well. i 
didn't
have any other platform to test it so i did what developers are 
supposed to
do in this case: just commit the code and wait for someone like you to 
send a
bug report ;-)
Fair enough. Unfortunately I didn't know enough about using CVS from 
the command line to find out exactly what had changed. I would have 
liked to look at the repository using cvsweb, but I only discovered 
 (found via Google) while 
doing the research for this email I'm writing now. I then went back to 
the wget website and found .

Now, it's not that I am a stupid person, or that I refuse to read. On 
the contrary, I have tried to read the wget website, but it just 
doesn't lend itself to being read. It's easy to overlook things. 
Usability studies show that humans scan and read websites in a 
different way than they read books.

Compare the wget website to some other open source projects like GCC 
() and Subversion 
(). These sites have a navigation bar on 
the side, and that's what the wget site should have too, I would argue. 
The closest thing the wget website has is its table of contents:

 * Introduction to GNU wget
 * News
 * Downloading GNU wget
 * The GNU wget FAQ
 * Documentation
 * Mailing lists
 * Request an Enhancement
 * Report a Bug
 * Add-ons
 * Development of GNU wget
 * Maintainer
I would argue that instead of the table of contents there should be a 
navigation sidebar with a structure something like the following:

 * About GNU wget
 * News
 * Download
 --> Mirrors
 --> Binaries
 --> CVS
 * Support
 --> FAQ
 --> Documentation
 --> Mailing lists
 * Contributing
 --> Request an enhancement
 --> Report a bug
 --> Developer's guide
 --> CVS access
 * External links
 --> Add-ons
That's just a rough idea, but it would bring it in line with what 
people expect based on the vast majority of websites out there.

As it is, things are very confusing. The wget project is spread out 
over half a dozen unrelated domains and even more subdomains. There's a 
"discussion" list and a patches list. Users wishing to report a bug are 
told to post it to [EMAIL PROTECTED], but that just forwards to the 
"discussion" list (why aren't users told to use bug tracker?). Users 
wishing to request an enhancement are told to post it to the discussion 
list (strictly speaking, enhancement requests belong in the bug tracker 
too, don't they?).

I know that your principal concern right now, Mauro, is working on the 
code, so please don't take all these comments about the wget website as 
demands that you do something about it. I merely wanted to put these 
thoughts on the record and hear what other people think about them. I 
would like to see all of this disparate stuff cleaned up and put on a 
single, easy to navigate server. It would be useful to know some kind 
of usage statistics (traffic, number of users) so as to know what kind 
of sponsorship would be needed to set that kind of thing up.

i'll fix this problem ASAP. anyway, i suddenly realize string.h is 
probably a
very poor choice for a filename to be included in wget.
For the record, strings.h would probably be a bad choice as well, since 
it is also the name of a system header file on Mac OS X:

-r--r--r--  1 root  wheel  4886 14 Sep  2003 /usr/include/string.h
-r--r--r--  1 root  wheel  2874 14 Sep  2003 /usr/include/strings.h
On FreeBSD (4.x, at least) those files are at:
-rw-r--r--  1 root  wheel   4208 Dec 25  2001 /usr/src/include/string.h
-rw-r--r--  1 root  wheel   1895 May 24  19

Large File Support, and broken CVS on Mac OS X

2005-01-10 Thread Greg Hurrell
On 10/01/2005, at 5:07, Mark Wiebe wrote:
This occurred after many hours of downloading.  At 2GB, the bytes  
downloaded started reporting negative numbers, but it still appeared  
to be working.
This is the third post to this list in three days about (lack of)  
large-file support in wget. The issue is addressed in the FAQ:

http://www.gnu.org/software/wget/faq.html#3.1
A lot of people obviously don't read the FAQ, though. And  
unfortunately, there's no easy way to search the mailing list archives  
before posting. If there were, I suspect that searches for things like  
"DVD", "GB", and "large file" would turn up hundreds of hits. (Yes,  
there are no doubt some web-based archives of the mailing list, such as  
, but there's no single  
official archive linked to from the official wget page or in the wget  
documentation.)

It's great that some people have already come up with platform-specific  
patches to address the shortcoming. Unfortunately, it seems that it's  
quite a bit harder to do it in a cross-platform manner, hence the delay  
in releasing an update. I have zero knowledge of other platforms, but I  
do at least have a Mac OS X box in front of me so I am going to post  
some comments about these patches on Mac OS X. I suspect there might be  
some commonality between Mac OS X and the other BSD-based operating  
systems on which wget runs.

[Yes, yes, curl has support for large files. But the version of curl  
that comes with Mac OS X (even the latest released version, 10.3.7) is  
old (7.10.2) and therefore doesn't have large file support. I have  
tested the latest curl release (built from source) with large files and  
it seems to work without any config tweaks.]

I know all these patches were most likely written for and tested on  
Linux, but I thought I'd give it a try anyway. I wanted to try all  
three patches that I've seen so far (the one by Leonid Petrov available  
from ; the one by Alvaro Lopez  
Ortega from  
;  
and the one by Dmitry Antipov which I didn't find for download anywhere  
but which was attached to a post to this list) but the current CVS  
version of wget is broken on Mac OS X. Don't know if it is on any other  
platforms. Specifically, the build fails during the "make" in the  
following way:

cd . && autoheader
echo timestamp > ./stamp-h.in
cd src && make CC='gcc' CPPFLAGS='' DEFS='-DHAVE_CONFIG_H  
-DSYSTEM_WGETRC=\"/usr/local/etc/wgetrc\"  
-DLOCALEDIR=\"/usr/local/share/locale\"' CFLAGS='-O2 -Wall  
-Wno-implicit' LDFLAGS='' LIBS='-lssl -lcrypto -ldl '  
prefix='/usr/local' exec_prefix='/usr/local' bindir='/usr/local/bin'  
infodir='/usr/local/info' mandir='/usr/local/man' manext='1'
/Users/ghurrell/tmp/dmitry/wget/src
gcc -I. -I. -DHAVE_CONFIG_H  
-DSYSTEM_WGETRC=\"/usr/local/etc/wgetrc\"  
-DLOCALEDIR=\"/usr/local/share/locale\" -O2 -Wall -Wno-implicit -c  
cmpt.c
In file included from cmpt.c:35:
string.h:35: error: parse error before "PARAMS"
make[1]: *** [cmpt.o] Error 1
make: *** [src] Error 2

I don't know why, but some kind of change has been made which causes  
gcc to use wget's "src/string.h" file instead of the system string.h  
file (at /usr/include/string.h). As a quick diagnostic kludge, I  
deleted the wget src/string.h file and confirmed that the build then  
completed without errors. The resulting wget binary runs, but I didn't  
have any confidence in it, so I didn't test it further, nor was I going  
to apply these patches to a broken CVS tree. The last time I pulled the source  
from CVS and built it on Mac OS X without problems was 21 November  
2004.

In any case, even though I couldn't test those patches on Mac OS X, I  
did take a good look at them (with my novice's eye!). Here are some  
notes, for what they're worth:

1. Mac OS X does have strtoll.
2. It also has the "off_t" type, which is defined as a 64-bit signed  
integer ("int64_t", which in turn is defined as long long).

3. The printf family of functions has support for format strings like  
"%llu".

4. There is one potential problem that I can see with the patches by  
Alvaro and Leonid when applied to Mac OS X. In utils.c there is a call  
to "ftell", expecting an off_t return value, but the function prototype  
declares that it returns a long return value:

long ftell(FILE *stream);
There is another function, ftello, which does what you want:
off_t ftello(FILE *stream);
5. The only other problem I could see was in the patch by Dmitry, where  
he passes "O_LARGEFILE" as an option to open, but that option isn't  
documented on Mac OS X.

And that's where my technical knowledge runs out! Hope the information  
is useful to people who are more knowledgeable than me, but might not  
have access to a Mac OS X box!

Greg


Re: new string module

2005-01-05 Thread Greg Hurrell
On 05/01/2005, at 2:46, Jan Minar wrote:
Indeed, there's no point in not trusting other parts of the program
(apart from robustness, sometimes).  I think I've heard this one
somewhere, and I have to repeat: there's no difference between the .po
files and the .h or .c files:  It's all just different ways of
programming.  You would have to rewrite gettext to make some security
boundary between the C code and the translated strings.
I meant any input coming from an untrusted source such as a different
user on the same system, or anything fetched from a network (be it a
genuine server response, or some MitM-injected crap). -- But this is a
basic security concept.
I would argue that even input coming from the *same* user should be 
sanitized. The user doesn't have to be malicious, but they could 
accidentally (for any number of reasons, from any number of sources) 
pass garbage input to wget and cause it to crash, which looks bad. 
Basically the "circle of trust" should be defined as the boundary 
between the program itself and *anything* outside of it.

Just my opinion.
Cheers,
Greg


Re: wget doesn't download error documents

2004-12-16 Thread Greg Hurrell
On 15/12/2004, at 23:38, Volker Kuhlmann wrote:
wget http://www.digikitten.com/k.txt
--11:31:48--  http://www.digikitten.com/k.txt
   => `www.digikitten.com/k.txt'
Resolving www.digikitten.com... 67.18.43.88
Connecting to www.digikitten.com|67.18.43.88|:80... connected.
HTTP request sent, awaiting response... 404 Not Found
11:31:48 ERROR 404: Not Found.
Viewed in a browser with javascript enabled, it's quite funny...
Sounds like the correct behaviour to me.

Re: wget: Arbitrary file overwriting/appending/creating and other vulnerabilities

2004-12-10 Thread Greg Hurrell
On 09/12/2004, at 10:14, Jan Minar wrote:
(0) Wget authors are/were incompetent.  Everything else is a corollary.
That's a very aggressive stance to take, and not likely to be 
productive. Patches, for example, would be more productive.

	-- Mauro Tortonesi in a private mail exchange with me
And did you ask Mauro for his permission before disclosing the contents 
of his private correspondence with you? I am doubtful that he would 
appreciate your disclosure, given that you then used it to attack the 
project of which he is the maintainer.


Re: --timestamping underdocumented

2004-11-16 Thread Greg Hurrell
On 16/11/2004, at 12:29, Martin MOKREJŠ wrote:
See current manpage:

  -m
  --mirror
      Turn on options suitable for mirroring.  This option turns on
      recursion and time-stamping, sets infinite recursion depth and
      keeps FTP directory listings.  It is currently equivalent to
      -r -N -l inf -nr.

Isn't --timestamping missing in this equivalence list? ... Oh, I see, 
-N is --timestamping.
Grr, I believe it should be mentioned in words here for clarity.
You yourself quoted that part of the manual which mentions it in words: 
"This option turns on recursion and time-stamping".

Cheers,
Greg