Michele Lunelli wrote:
Dear all,
I'm a bit confused by the output of the CORRECT step in XDS. In one of the first tables I can read the mean I/sigma for each resolution shell, but these values are quite different from the I/sigma reported in the table at the end of the output file, titled "completeness and quality of data set" for the full data range with signal/noise > -3.0. For example, from the first table I get I/sigma = 2 at 3.6 A, while from the second table I get I/sigma = 2 at 2.8 A! What exactly is the difference between the two values? And which one should be used to decide the resolution cutoff?


Thank you in advance,

Michele Lunelli
MPI for Infection Biology
Berlin - Germany


Michele,

the first table, which is long and fine-grained in resolution, has (citing CORRECT.LP):

  I/Sigma  = mean intensity/Sigma of a reflection in shell

and is thus talking about individual reflections _before_ merging symmetry equivalents.


Later tables have (again citing CORRECT.LP):

 I/SIGMA  = mean of intensity/Sigma(I) of unique reflections
            (after merging symmetry-related observations)

and are thus giving statistics _after_ merging.
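The reason the merged values are higher: averaging N equivalent observations reduces the sigma of the mean by roughly sqrt(N), so the merged I/sigma improves with multiplicity. A minimal sketch, assuming plain inverse-variance-weighted merging (XDS's actual procedure additionally applies error-model corrections, so this is only illustrative; the function name is mine):

```python
import math

def merged_i_over_sigma(i_obs, sig_obs):
    """Merge symmetry-equivalent observations by inverse-variance
    weighting and return I/sigma(I) of the merged reflection."""
    weights = [1.0 / s**2 for s in sig_obs]
    i_merged = sum(w * i for w, i in zip(weights, i_obs)) / sum(weights)
    sig_merged = math.sqrt(1.0 / sum(weights))  # sigma of the weighted mean
    return i_merged / sig_merged

# Four equivalent observations, each with individual I/sigma = 2:
print(merged_i_over_sigma([100.0] * 4, [50.0] * 4))  # -> 4.0, i.e. 2 * sqrt(4)
```

So with 4-fold multiplicity, shells where individual reflections have I/sigma around 2 can show a merged I/sigma around 4, which is why the "I/sigma = 2" crossing moves to higher resolution in the merged table.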

The latter tables are the ones suitable for deciding on the resolution cutoff.

HTH,

Kay
--
Kay Diederichs                http://strucbio.biologie.uni-konstanz.de
email: [EMAIL PROTECTED]    Tel +49 7531 88 4049 Fax 3183
Fachbereich Biologie, Universität Konstanz, Box M647, D-78457 Konstanz
