I have a series of data coming from an instrument (e.g. a
spectrophotometer): say, 10 wavelengths and 5 repetitions of the
measurement.

matrix X (x_ij),  i = 1..10 (wavelengths),  j = 1..5 (measurements)

I want to calculate the "mean total signal" together with its
uncertainty.

Choice 1) I calculate the mean value of the signal at each of the 10
wavelengths (x_i-bar, the mean over j), together with its standard
deviation. I then take the mean total signal as the sum (over i) of
these means, and its relative uncertainty as the square root of the
sum of the squared relative uncertainties.
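A minimal sketch of Choice 1 in Python/numpy (my own illustration, not
from the original post; it assumes the matrix is stored with rows =
repetitions and columns = wavelengths, as in the example data below):

import numpy as np

# Example data: rows j = 1..5 (repetitions), columns i = 1..10 (wavelengths)
X = np.arange(1, 51).reshape(5, 10)

# Mean and standard deviation at each wavelength (statistics over j)
col_mean = X.mean(axis=0)                 # x_i-bar for i = 1..10
col_sd = X.std(axis=0, ddof=1)            # sample SD at each wavelength
col_sem = col_sd / np.sqrt(X.shape[0])    # standard deviation of the mean

# Mean total signal = sum over i of the per-wavelength means
total = col_mean.sum()                    # 255 for the example data

# Combination rule as described above: relative uncertainties in quadrature
rel_u = np.sqrt(np.sum((col_sem / col_mean) ** 2))
print(total, rel_u * total)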

Choice 2) I calculate the sum of the 10 signals for each of the 5
repetitions (the sum over i), then the mean value (over j) and the
standard deviation of these 5 sums.
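And a matching sketch of Choice 2 (same assumptions and example data
as above):

import numpy as np

X = np.arange(1, 51).reshape(5, 10)

# Total signal of each repetition first (sum over i), then statistics over j
totals = X.sum(axis=1)                    # 55, 155, 255, 355, 455
mean_total = totals.mean()                # 255 for the example data
sd_total = totals.std(ddof=1)             # spread of the 5 totals
print(mean_total, sd_total, sd_total / np.sqrt(len(totals)))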

The resulting uncertainties are (obviously) different.
What's the correct procedure? And most of all, why?

Thanks in advance
References are welcome.

   rob


Example data (rows = repetitions j, columns = wavelengths i):
1       2       3       4       5       6       7       8       9       10
11      12      13      14      15      16      17      18      19      20
21      22      23      24      25      26      27      28      29      30
31      32      33      34      35      36      37      38      39      40
41      42      43      44      45      46      47      48      49      50


Choice 1: per-wavelength means (over j):
21      22      23      24      25      26      27      28      29      30
mean total signal = 21 + 22 + ... + 30 = 255


Choice 2: per-repetition totals (over i):
55
155
255
355
455
mean total signal = (55 + 155 + 255 + 355 + 455) / 5 = 255
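A quick check that both routes give the same mean total signal on
these numbers (by linearity they always do; the two choices differ in
the uncertainty, not in the mean):

import numpy as np
X = np.arange(1, 51).reshape(5, 10)
print(X.mean(axis=0).sum())   # Choice 1: sum of per-wavelength means -> 255.0
print(X.sum(axis=1).mean())   # Choice 2: mean of per-repetition totals -> 255.0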