Dear David,
On Fri, Dec 18, 2020 at 11:53:08AM +0000, David Waterman wrote:
> The paper "Substructure solution with SHELXD
> <https://journals.iucr.org/d/issues/2002/10/02/gr2280/index.html>"
> (Schneider & Sheldrick, 2002) describes how
>
> > data can be truncated at the resolution at which [ΔF to its estimated
> > standard deviation as a function of the resolution] drops to below about 1.3
>
> Is this referring to the quantity <|ΔF|>/<σ(ΔF)> calculated in resolution
> shells, or the quantity <|ΔF|/σ(ΔF)> ?
I'm nearly 100% sure this refers to the latter - or at least: the
latter is the only one making sense to me. This sounds very much like
the confusion when it comes to
  <I/sig(I)>                                                    (1)
  ==> PDBx/mmCIF: _reflns.pdbx_netI_over_sigmaI            73.6  % of entries
                  _reflns_shell.pdbx_netI_over_sigmaI_all   0.001% of entries
                  _reflns_shell.pdbx_netI_over_sigmaI_obs   2.6  % of entries
versus
  <I>/<sigI>                                                    (2)
  ==> PDBx/mmCIF: _reflns.pdbx_netI_over_av_sigmaI          2.6  % of entries
                  _reflns_shell.meanI_over_sigI_all         0.2  % of entries
                  _reflns_shell.meanI_over_sigI_obs        53.0  % of entries
As far as I can remember, we always computed and reported (1) and
never (2) - at least when it comes to the scaling/merging programs I'm
familiar with (SCALA, XDS/XSCALE, AIMLESS, d*TREK). What useful
information would (2) or <|ΔF|>/<σ(ΔF)> convey anyway ... ?
If we were to believe these definitions, then we are storing the
"right/useful" value <I/sig(I)> in the overall statistics, but a very
different value of <I>/<sig(I)> in the per-shell statistics. All those
_reflns_shell.meanI_over_sigI_obs values are most likely mis-labeled (1)
quantities.
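To make the distinction concrete, here is a small sketch (not from any of the programs mentioned above; the synthetic reflection data are purely illustrative) showing that (1), the mean of the per-reflection ratios, and (2), the ratio of the means, generally give different numbers for the same shell:

```python
# Illustration: <I/sig(I)> (quantity 1) versus <I>/<sig(I)> (quantity 2)
# on the same synthetic resolution shell. The data are made up; only the
# definitions of the two quantities matter here.
import random

random.seed(0)
n = 1000
# Hypothetical shell: intensities and their sigmas with some spread
intensities = [random.uniform(1.0, 100.0) for _ in range(n)]
sigmas = [random.uniform(1.0, 10.0) for _ in range(n)]

# (1) mean of the per-reflection ratios I/sig(I)
mean_ratio = sum(i / s for i, s in zip(intensities, sigmas)) / n

# (2) ratio of the mean intensity to the mean sigma
ratio_of_means = (sum(intensities) / n) / (sum(sigmas) / n)

print(mean_ratio, ratio_of_means)  # the two quantities clearly differ
```

The two values agree only in the degenerate case where all sigmas are equal, which is why mis-labelling one as the other in deposited statistics is genuinely misleading.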
> This entry
> <https://strucbio.biologie.uni-konstanz.de/ccp4wiki/index.php?title=SHELX_C/D/E#Resolution_cutoff_.28SHEL.29>
> on the ccp4wiki gives a cutoff
>
> > where the mean value of |ΔF|/σ(ΔF) falls below about 1.2 (a value of 0.8
> > would indicate pure noise)
>
> this version sounds to me like <|ΔF|/σ(ΔF)>
>
> which is the "better" metric, and what do people mean when they say
> DANO/SIGDANO? What is the justification for the 1.3 (or 1.2) value?
I think everyone always refers to <|ΔF|/σ(ΔF)> no matter what it is
called (sometimes programmers shorten the notation to avoid unwieldy,
wide columns).
I tend to look for values above 1 (and the higher, the better) - but
maybe even more importantly: check the trend with resolution (it should
be higher at low resolution), ideally in comparison with expectations
(type of scatterer, fluorescence scan, expected anomalous signal,
number of sites, likely B-factors of the scatterers, etc.).
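As an aside on where the "0.8 means pure noise" figure comes from: if the anomalous differences ΔF are nothing but Gaussian noise with zero mean and standard deviation σ, then E[|ΔF|]/σ = sqrt(2/π) ≈ 0.798. A quick simulation (my own sketch, not from SHELX) confirms this baseline:

```python
# Why pure noise gives <|dF|/sig(dF)> of about 0.8:
# for dF ~ N(0, sigma), the expected absolute value is sigma*sqrt(2/pi).
import math
import random

random.seed(0)
sigma = 3.0
# Simulated anomalous differences that contain no signal at all
dF = [random.gauss(0.0, sigma) for _ in range(200_000)]
mean_abs_over_sigma = sum(abs(x) for x in dF) / len(dF) / sigma

print(round(math.sqrt(2 / math.pi), 3))  # 0.798, the analytic value
print(round(mean_abs_over_sigma, 2))     # close to 0.8
```

Seen this way, the 1.2-1.3 cutoff is simply a margin above the 0.8 noise floor: the shells are kept only while the observed ratio sits comfortably above what noise alone would produce.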
Cheers
Clemens