Sam had checked in a fix (according to Bonsai) for the proper use of
decimals in PostScript output that used formatted strings like "0.%.0f" to 
hard-code the decimal point as a period.  Unfortunately, this broke the case 
where the R, G, or B color component was full-on (should be "1.0") by forcing
a leading "0.".

I noticed this bug only after trying to figure out why my colors were 
all wrong.  It seems printf is taking numbers like "0.0531123"
and chucking away the "0.0" part.  When a number like this is run through
a format like "0.%.0f", the result is "0.531123" -- an order of
magnitude too large.  Printf is stripping the leading zeroes,
which makes colors with low R, G, or B components come out wrong.

From the printf manual page, I couldn't quite understand exactly what
it was supposed to do, so I took the easy way out and just wrapped the
old formatted strings in setlocale(LC_NUMERIC, "C"); [code];
setlocale(LC_NUMERIC, "");.  Is there a good reason not to do it this way?

-- 
Shaw Terwilliger <[EMAIL PROTECTED]>
