On Mon, 3 May 2004 20:36:59 -0400, Stan Brown <[EMAIL PROTECTED]> wrote:
> A correspondent wrote me for permission to quote my rule of thumb
> for significant digits in a mean or standard deviation, and in the
> course of our discussions I realized I was not very happy with the
> rule as I had stated it, even though it was just intended for the
> classroom with "toy" data sets.
>
> Does anyone know of a good, clear, and simple rule for how many
> significant digits (or how many decimal places) should be in the
> mean, variance, and standard deviation, based on the significant
> digits (or decimal places) in the original data and also, I presume,
> on the size of the data set?

He might also have reason to mention the number of digits to *state*
in the original data -- what is the convention that determines
"significant digits" there? Whatever the technical rules of thumb
suggest, you probably should not vary by more than one digit from the
practice of the recognized experts of the field on similar data, if
you don't want eyebrows to rise.

> Ideal would be a URL that I could pass on to my correspondent, who's
> trying to write a manual for use in a chemical laboratory.

--
Rich Ulrich, [EMAIL PROTECTED]
http://www.pitt.edu/~wpilib/index.html

=================================================================
Instructions for joining and leaving this list, remarks about the
problem of INAPPROPRIATE MESSAGES, and archives are available at:
http://jse.stat.ncsu.edu/
=================================================================
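[Editor's note: one concrete rule of thumb in this spirit -- not endorsed anywhere in the thread, so treat it as an assumption -- is the convention from measurement texts: compute the standard error of the mean, keep two significant digits of it, and round the mean and SD to that same decimal place. A minimal sketch, with an invented helper name and example numbers:]

```python
import math

def round_mean_sd(mean, sd, n):
    """Round a reported mean and SD using a common textbook convention:
    keep two significant digits in the standard error of the mean,
    then round the mean and SD to that same decimal place.
    This is one rule of thumb among several, not a universal standard.
    """
    se = sd / math.sqrt(n)  # standard error of the mean
    if se == 0:
        return mean, sd
    # Decimal position of the second significant digit of the SE.
    # E.g. se = 0.054 -> log10 ~= -1.26 -> floor = -2 -> 3 decimals.
    decimals = 1 - int(math.floor(math.log10(abs(se))))
    return round(mean, decimals), round(sd, decimals)

# Hypothetical example: 25 readings, mean 12.3456, SD 0.2718.
# SE = 0.2718 / 5 = 0.0544, so report to three decimal places.
print(round_mean_sd(12.3456, 0.2718, 25))
```

Note the dependence on n: with more observations the standard error shrinks, so the mean earns roughly one extra reported digit for every hundredfold increase in sample size.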
