Re: [ccp4bb] AutoXDS

2011-03-03 Thread Tim Gruene
Dear Marco,

I did not know autoxds, but Google suggests that it is a csh (uarrrgh) script
from SLAC. So as long as you have some C shell installed on your Mac, the script
should execute.
Maybe you can be more precise about what you mean by your last sentence to get
more accurate help.

Cheers, Tim

On Wed, Mar 02, 2011 at 07:25:36PM +0100, Marco Lolicato wrote:
 Dear all,
 any suggestions on how to run autoxds on Mac OS X?
 I downloaded a script without success... do you know if there are others?
 
 Thanks,
 
 Marco 

-- 
--
Tim Gruene
Institut fuer anorganische Chemie
Tammannstr. 4
D-37077 Goettingen

phone: +49 (0)551 39 22149

GPG Key ID = A46BEE1A





Re: [ccp4bb] while on the subject of stereo

2011-03-03 Thread Sergei Strelkov

Dear Dave,

Here come my five pence...
I personally found stereo graphics useful in two cases.

1. When you first introduce students to biomolecular
structure and/or biocrystallography. Showing stereo
certainly helps 'building up' the initial fascination,
which is very important of course. But since we do not
have a classroom stereo setup, I am just speaking
of seating a new/prospective (graduate) student
in front of a stereo workstation.

2. When one performs more difficult tasks while
doing research (although it was not your question).
This includes building difficult regions in poor/low resolution
maps as already mentioned, but maybe even more
importantly when trying to make sense of a difficult
MR case and finally when dealing with protein docking.

However:
1. In my experience, what at least the better students
do is to look at a structure using a simple program
like Swiss PDB Viewer or Rasmol and their 300 euro
laptop. No arguments can persuade them
to use the 3000 euro lab stereo setup -- because they
can manage to see what they want to see by just
rotating the molecule...

2. For both teaching purposes and publications,
I remain an adept of printed stereo pairs.
Get each of your students a 5 euro stereo viewer
and give them a handout full of stereo pairs
rather than mono images. The very important
thing is that, on paper, one can make
notes and drawings. Active digestion
of the teaching material (rather than passive
staring at a screen) has long been known
to aid efficient learning...

I can summarize my view as follows:
for /most/ purposes, you should be fine
by using one of the two:
simple mono graphics to achieve
the 3D effect by rotation -- or printed
stereo pairs.

HTH
Sergei


Thanks for the comments, I do appreciate them.  I guess we went off in a
direction I wasn't thinking of - related to your personal like or
dislike of stereo.  What I am really looking for is the answer to a
simple question: is stereo a nice thing, from a pedagogical
standpoint, for showing students complex biomolecules?

I am in a chemistry department - undergraduate only.  We focus on
3-dimensional shape and the importance of shape for chemical
function/reactivity/etc...  With small molecules (PF5, etc...), it's
easy to see how shape works by simply rotating the molecule.  The
molecules are small enough that the concept of 3D can be visualized easily
in these systems.  Furthermore, students can make a simple model using a
standard organic or inorganic model kit, no worries.

Now, bring in a huge protein, or a protein-protein complex.  The issue
of 3D-ness becomes fuzzier.  It's not so easy to see which hydrogen will
get plucked off during a chemical reaction, even with careful zooming
and mouse manipulation.  So my question still is: how many of you feel
stereo is important from a pedagogical standpoint (not for looking at maps,
just at structures that are huge and complex)?  Is it something that we
need to try to bring to the classroom, or is it just a cool toy like the
3D TV that hopefully is going nowhere and will soon fade out like the
Viewmaster of old?  I know a large percentage of people cannot see
stereo (at least the way we present it), and so it isn't for everybody.
But does it help, and if so, does it help when done in a huge classroom
or when put on an individual screen?  Has anybody tried to assess this
(there's a horrible word for you)?

That's what I was wondering about.  Presenting the stereo is a different
issue (how is that done), but I think there are lots of avenues for that
depending on your particular situation.

Thanks again

Dave




Re: [ccp4bb] Problem with refinement and positive electron density

2011-03-03 Thread Eleanor Dodson
I think you have been caught by a new REFMAC feature which tries to
design its own TLS groups, including linked H2Os and ligands.


Check your tls output records and see what it has clustered into a group..

I am not sure how to disable this - at times I want to override any 
automatic selection..

Eleanor

On 03/01/2011 07:48 PM, Judith Reeks wrote:

Dear All,

Thank you for your suggestions. Many of you asked what the occupancies
were in the region and they were all one, so partial occupancy was not
the problem.

I was using TLS restraints during the refinement when this problem
happened. Given the suggestions that TLS may be a problem and that might
be causing the low B-factors, I went back and re-ran the refinement
without TLS and the problem disappeared. Then I submitted my latest file
to the TLSMD server for new restraints and the next round of refinement
got rid of the problem. The B-factors increased to normal levels (~15
compared to ~5 before) so it seems to have done the trick.

Thank you to everybody for their help,

Judith Reeks

ja...@st-andrews.ac.uk

School of Chemistry

University of St Andrews


On 01/03/2011 17:28, Mark Robien wrote:

Hmm - kinda interesting

In addition to the sorts of things suggested by Mark van Raaij, older
versions of Refmac
were prone to a phenomenon with B factors - once they get over a
certain level, the algorithm
has a very hard time bringing them back down, even when the data suggest
it.

I've usually seen it with much higher B factors than you seem to have
here - for example, loops where the B's
are actually 40-60, but previous rounds of refinement have the B
factors at 60-80 or higher.
Judging from the coot screen & the residues you are focusing on, I
doubt that is the answer (unless you also
have a TLS model, in which case I'd wonder). If you do have TLS -
well, things get more complicated; for example,
is this the edge of a TLS domain?

Nonetheless, you could try the solution for the problem that I
describe - which is to reset all the B factors to a (very)
low B factor - maybe even as low as 2.0 (lower?), and then another
round of refmac - with a sufficient number of
cycles - will re-refine the B (and xyz), thus escaping the region of
refinement space that has a very weak gradient.

A variant approach that might be appropriate - similarly reset the B
for the (?small) region of your model that has the problem.

Sadly, it's been a while since I did any refinement myself - but the
Uppsala suite had some of the nicer tools for resetting B within pdb
files, without having to do it manually (ugh - not appealing) - or
writing an ad hoc awk script (a very easy alternative, if you're
familiar enough).
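
For illustration, a minimal Python sketch of the "reset the B factors" step
(filenames hypothetical; the Uppsala tools mentioned above do this more
robustly):

    # Reset the B-factor column (characters 61-66 of fixed-width PDB
    # records) of every ATOM/HETATM line to a low starting value.
    NEW_B = 2.0
    with open("model.pdb") as fin, open("model_breset.pdb", "w") as fout:
        for line in fin:
            if line.startswith(("ATOM", "HETATM")):
                line = line[:60] + f"{NEW_B:6.2f}" + line[66:]
            fout.write(line)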

Mark Robien


On 3/1/2011 10:32 AM, Judith Reeks wrote:

Dear All,

I am currently refining a structure using the latest experimental
version of Refmac (5.6) and there seems to be a problem with my Fo-Fc
map. There is a region where I have fitted residues to the electron
density but after refinement there is still positive electron density
assigned to the region despite the fact that residues fit the
electron density (see link below for a screenshot). Multiple rounds
of refinement have yet to get rid of this problem. I have checked the
PDB file and there does not appear to be any problems with this
region. Has anybody seen something like this before?

http://i1083.photobucket.com/albums/j382/jreeks/Screenshot2011-03-01at1613402.png


Regards,

Judith Reeks

ja...@st-andrews.ac.uk

School of Chemistry

University of St Andrews







Re: [ccp4bb] Is there any program for specifically calculating Rvalue in CCP4

2011-03-03 Thread Eleanor Dodson

On 03/03/2011 06:13 AM, Ting-Wei Jiang wrote:

Dear all experts,

I'm trying to calculate the R-value (and free R) between the data and a
modified structure (refined by myself, without help from any program). I've
been looking for a program to calculate it for a long while. Actually, I
found one named Rsearch (CCP4 supported). Nevertheless, I can't find Rsearch
in the CCP4 package.
Could anyone direct me on how to get this value? Thanks in advance.

TingWei




Rsearch was a molecular replacement search program which calculated 
Rvalues for many positions in the unit cell. Not what you need.


If you have a model and observed data, the simplest way to get an Rfactor 
is to run 0 cycles of REFMAC. This will give you the Rfactors for your 
model without applying any shifts. It will adjust the scale and Bfactors 
to get the best fit, though.
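
(For reference, the quantity being reported is the conventional R-factor,
with k an overall scale factor:

    R = \frac{ \sum_{hkl} \bigl| |F_{obs}| - k\,|F_{calc}| \bigr| }{ \sum_{hkl} |F_{obs}| }

summed over all measured reflections.)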


If you already have a file containing Fobs, SigFobs and Fc, rstats will 
calculate Rfactors. It is too old to consider free Rfactors, though.


All the best Eleanor


[ccp4bb] Scientific Programmer Position

2011-03-03 Thread SHEPARD William
Dear All, 
 
Please find below an announcement for a Scientific Programmer on the PROXIMA 2 
beamline at Synchrotron SOLEIL. The position is a 2-year contract suitable for 
a PXer with strong computing & instrumentation skills or a Computing Engineer. 
 
http://www.synchrotron-soleil.fr/images/File/soleil/DivisionAdministration/Personnel/SOLEIL_postDoc_PROXIMA_2.pdf
 
 
Kind regards,
William SHEPARD
PROXIMA 2 Beamline Responsible
 


[ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Roberto Battistutta
Dear all,
I got a reviewer comment that indicates the need to refine the structures at an 
appropriate resolution (I/sigmaI of > 3.0), and re-submit the revised coordinate 
files to the PDB for validation. In the manuscript I present some crystal 
structures determined by molecular replacement, using the same protein in a 
different space group as search model. Does anyone know the origin or the 
theoretical basis of this I/sigmaI > 3.0 rule for an appropriate resolution?
Thanks,
Bye,
Roberto.


Roberto Battistutta
Associate Professor
Department of Chemistry
University of Padua
via Marzolo 1, 35131 Padova - ITALY
tel. +39.049.8275265/67
fax. +39.049.8275239
roberto.battistu...@unipd.it
www.chimica.unipd.it/roberto.battistutta/
VIMM (Venetian Institute of Molecular Medicine)
via Orus 2, 35129 Padova - ITALY
tel. +39.049.7923236
fax +39.049.7923250
www.vimm.it


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Eleanor Dodson

No - and I don't think it is accepted practice now either...

I often use I/SigI > 1.5 for refinement...

Look at your Rfactor plots from REFMAC - if they look reasonable at 
higher resolution, use the data.

Eleanor



On 03/03/2011 11:29 AM, Roberto Battistutta wrote:

Dear all,
I got a reviewer comment that indicates the need to refine the structures at an appropriate 
resolution (I/sigmaI of > 3.0), and re-submit the revised coordinate files to the PDB for 
validation. In the manuscript I present some crystal structures determined by molecular 
replacement, using the same protein in a different space group as search model. Does anyone know the 
origin or the theoretical basis of this I/sigmaI > 3.0 rule for an appropriate resolution?
Thanks,
Bye,
Roberto.


Roberto Battistutta
Associate Professor
Department of Chemistry
University of Padua
via Marzolo 1, 35131 Padova - ITALY
tel. +39.049.8275265/67
fax. +39.049.8275239
roberto.battistu...@unipd.it
www.chimica.unipd.it/roberto.battistutta/
VIMM (Venetian Institute of Molecular Medicine)
via Orus 2, 35129 Padova - ITALY
tel. +39.049.7923236
fax +39.049.7923250
www.vimm.it


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Mischa Machius
Roberto,

The reviewer's request is complete nonsense. The problem is how best and 
politely to respond, so as not to prevent the paper from being accepted. Best would 
be to have educated editors who could simply tell you to ignore the request.

Since this issue still comes up quite often, I think we all should come up with 
a canned response to such a request.

One way to approach this issue is to avoid saying something like "the structure 
has been refined to 2.2Å resolution", but instead say "has been refined using 
data to a resolution of 2.2Å", or even "has been refined using data with an 
I/sigmaI > 1.5" (or whatever). Next could be to point out that even data with 
an I/sigmaI of 1 can contain information (I actually don't have a good 
reference for this, but I'm sure someone else can provide one), and inclusion 
of such data can improve refinement stability and speed of convergence (not 
really important in a scientific sense, though).

The point is that all of your data combined result in a structure with a 
certain resolution, pretty much no matter what high-resolution limit you 
choose (I/sigmaI of 0.5, 1.0, or 1.5). As long as you don't portray your 
structure as having a resolution corresponding to the 
high-resolution limit of your data, you should be fine.

Now, requesting to toss out data with I/sigmaI of < 3 simply reduces the 
resolution of your structure. You could calculate two electron density maps and 
show that your structure does indeed improve when including data with I/sigmaI 
of < 3. One criterion could be to use the optical resolution of the structure.

Hope that helps.

Best,
MM

On Mar 3, 2011, at 6:29 AM, Roberto Battistutta wrote:

 Dear all,
 I got a reviewer comment that indicates the need to refine the structures at 
 an appropriate resolution (I/sigmaI of > 3.0), and re-submit the revised 
 coordinate files to the PDB for validation. In the manuscript I present 
 some crystal structures determined by molecular replacement, using the same 
 protein in a different space group as search model. Does anyone know the 
 origin or the theoretical basis of this I/sigmaI > 3.0 rule for an 
 appropriate resolution?
 Thanks,
 Bye,
 Roberto.
 
 
 Roberto Battistutta
 Associate Professor
 Department of Chemistry
 University of Padua
 via Marzolo 1, 35131 Padova - ITALY
 tel. +39.049.8275265/67
 fax. +39.049.8275239
 roberto.battistu...@unipd.it
 www.chimica.unipd.it/roberto.battistutta/
 VIMM (Venetian Institute of Molecular Medicine)
 via Orus 2, 35129 Padova - ITALY
 tel. +39.049.7923236
 fax +39.049.7923250
 www.vimm.it


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread John R Helliwell
Dear Roberto,
As indicated by others in reply to you, current best practice in
protein crystallography is not a rigid application of such a cut-off
criterion, because there is such a diverse range of crystal
qualities. However, in chemical crystallography, where the data quality
from such crystals is more homogeneous, such a rule is more often
required, notably as a guard against 'fast and loose' data collection,
which may occur (to achieve a very high throughput).

As an Editor myself, whilst usually allowing the authors' chosen
resolution cut off, I will insist on the data table saying in a
footnote the diffraction resolution where I/sig(I) crosses 2.0 and/or,
if relevant, where DeltaAnom/sig(DeltaAnom) crosses 1.0.

A remaining possible contentious point with a submitting author is
where the title of a paper may claim a diffraction resolution that in
fact cannot really be substantiated.

Best wishes,
Yours sincerely,
John



On Thu, Mar 3, 2011 at 11:29 AM, Roberto Battistutta
roberto.battistu...@unipd.it wrote:
 Dear all,
 I got a reviewer comment that indicates the need to refine the structures at 
 an appropriate resolution (I/sigmaI of > 3.0), and re-submit the revised 
 coordinate files to the PDB for validation. In the manuscript I present 
 some crystal structures determined by molecular replacement, using the same 
 protein in a different space group as search model. Does anyone know the 
 origin or the theoretical basis of this I/sigmaI > 3.0 rule for an 
 appropriate resolution?
 Thanks,
 Bye,
 Roberto.


 Roberto Battistutta
 Associate Professor
 Department of Chemistry
 University of Padua
 via Marzolo 1, 35131 Padova - ITALY
 tel. +39.049.8275265/67
 fax. +39.049.8275239
 roberto.battistu...@unipd.it
 www.chimica.unipd.it/roberto.battistutta/
 VIMM (Venetian Institute of Molecular Medicine)
 via Orus 2, 35129 Padova - ITALY
 tel. +39.049.7923236
 fax +39.049.7923250
 www.vimm.it




-- 
Professor John R Helliwell DSc


Re: [ccp4bb] Problem with refinement and positive electron density

2011-03-03 Thread Judith Reeks

Dear Eleanor,

I don't think that was the case. I added 'tlsd waters exclude' to the 
refinement, which as I understand it should prevent that. Also, I checked 
the output files and the TLS groups only involve my protein; waters and 
ligands aren't included anywhere. I can't explain why my TLS restraints 
were giving me problems.


Regards,

Judith

On 03/03/2011 10:49, Eleanor Dodson wrote:
I think you have been caught by a new REFMAC feature which tries to 
design its own TLS groups, including linked H2Os and ligands.


Check your tls output records and see what it has clustered into a 
group..


I am not sure how to disable this - at times I want to override any 
automatic selection..

Eleanor

On 03/01/2011 07:48 PM, Judith Reeks wrote:

Dear All,

Thank you for your suggestions. Many of you asked what the occupancies
were in the region and they were all one, so partial occupancy was not
the problem.

I was using TLS restraints during the refinement when this problem
happened. Given the suggestions that TLS may be a problem and that might
be causing the low B-factors, I went back and re-ran the refinement
without TLS and the problem disappeared. Then I submitted my latest file
to the TLSMD server for new restraints and the next round of refinement
got rid of the problem. The B-factors increased to normal levels (~15
compared to ~5 before) so it seems to have done the trick.

Thank you to everybody for their help,

Judith Reeks

ja...@st-andrews.ac.uk

School of Chemistry

University of St Andrews


On 01/03/2011 17:28, Mark Robien wrote:

Hmm - kinda interesting

In addition to the sorts of things suggested by Mark van Raaij, older
versions of Refmac
were prone to a phenomenon with B factors - once they get over a
certain level, the algorithm
has a very hard time bringing them back down, even when the data suggest
it.

I've usually seen it with much higher B factors than you seem to have
here - for example, loops where the B's
are actually 40-60, but previous rounds of refinement have the B
factors at 60-80 or higher.
Judging from the coot screen & the residues you are focusing on, I
doubt that is the answer (unless you also
have a TLS model, in which case I'd wonder). If you do have TLS -
well, things get more complicated; for example,
is this the edge of a TLS domain?

Nonetheless, you could try the solution for the problem that I
describe - which is to reset all the B factors to a (very)
low B factor - maybe even as low as 2.0 (lower?), and then another
round of refmac - with a sufficient number of
cycles - will re-refine the B (and xyz), thus escaping the region of
refinement space that has a very weak gradient.

A variant approach that might be appropriate - similarly reset the B
for the (?small) region of your model that has the problem.

Sadly, it's been a while since I did any refinement myself - but the
Uppsala suite had some of the nicer tools for resetting B within pdb
files, without having to do it manually (ugh - not appealing) - or
writing an ad hoc awk script (a very easy alternative, if you're
familiar enough).

Mark Robien


On 3/1/2011 10:32 AM, Judith Reeks wrote:

Dear All,

I am currently refining a structure using the latest experimental
version of Refmac (5.6) and there seems to be a problem with my Fo-Fc
map. There is a region where I have fitted residues to the electron
density but after refinement there is still positive electron density
assigned to the region despite the fact that residues fit the
electron density (see link below for a screenshot). Multiple rounds
of refinement have yet to get rid of this problem. I have checked the
PDB file and there does not appear to be any problems with this
region. Has anybody seen something like this before?

http://i1083.photobucket.com/albums/j382/jreeks/Screenshot2011-03-01at1613402.png 




Regards,

Judith Reeks

ja...@st-andrews.ac.uk

School of Chemistry

University of St Andrews









Re: [ccp4bb] Is there any program for specifically calculating Rvalue in CCP4

2011-03-03 Thread Ed Pozharski
On Thu, 2011-03-03 at 14:13 +0800, Ting-Wei Jiang wrote:
 I'm trying to calculate the R-value (and free R) between the data and a
 modified structure (refined by myself, without help from any program).

At long last, someone escaped the tyranny and oppression of the
refinement programs and deciphered the message hidden within the data by
using the mystical power known to the ancients as X-ray vision. Paging
Dan Brown.

On a serious note, if you have the Fo and Fc, the calculation is rather
trivial (if that is the case, let me know and I'll send you a little
piece of code that does that).
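
Such a snippet could be as small as this (a sketch, not Ed's actual code;
it assumes Fo and Fc are already extracted into numpy arrays, and uses a
simple least-squares scale k):

    import numpy as np

    def r_factor(fobs, fcalc):
        # R = sum | |Fo| - k|Fc| | / sum |Fo|, with a least-squares scale k
        fobs, fcalc = np.abs(fobs), np.abs(fcalc)
        k = np.sum(fobs * fcalc) / np.sum(fcalc * fcalc)
        return np.sum(np.abs(fobs - k * fcalc)) / np.sum(fobs)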

However, I suspect that all you have is the model (xyzb for every atom)
and Fo.  In which case there are many ways to do the calculation:

- sfcheck
- refmac with 0 rounds of refinement (already mentioned by Eleanor)
- CNS script model_stats.inp
- clipper libs
- phenix.fmodel will probably output it too or maybe there is some
phenix.get_me_my_damn_rfactor or such

The problem you'll have is that the values will differ between
programs.  Given that your model is exactly the same, and presumably the
atomic scattering factors used by these programs are the same, the most
likely reason is the various scaling operations applied.  IMHO, the main
source of difference should be the way one treats the bulk solvent
correction.

HTH, and yes, the first paragraph is meant to be a joke.

Ed.

-- 
I'd jump in myself, if I weren't so good at whistling.
   Julian, King of Lemurs


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Ed Pozharski
On Thu, 2011-03-03 at 12:29 +0100, Roberto Battistutta wrote:
 Does anyone know the origin or the theoretical basis of this I/sigmaI
 > 3.0 rule for an appropriate resolution?

There is none.  Did the editor ask you to follow this suggestion?  I
wonder if there is anyone among the subscribers of this bb who would
come forward and support this I/sigmaI > 3.0 claim.

What was your I/sigma, by the way?  I almost always collect data to
I/sigma=1, which has the downside of generating somewhat higher
R-values.  Shall I, according to this reviewer, retract/amend every
single one of them?  What a mess.

Cheers,

Ed.

-- 
I'd jump in myself, if I weren't so good at whistling.
   Julian, King of Lemurs


Re: [ccp4bb] while on the subject of stereo

2011-03-03 Thread Ed Pozharski
On Wed, 2011-03-02 at 19:23 -0500, David Roberts wrote:
 What I am really looking for is an answer to a 
 simple question in that is stereo a nice thing from a pedagogy 
 standpoint for showing students complex biomolecules. 

Of course it is.  Exactly how much excitement it generates among the
millennials with their iThings is hard to gauge, but it definitely has
greater potential as a teaching tool than the formula for phased
translation function.

Cheers,

Ed. 

-- 
I'd jump in myself, if I weren't so good at whistling.
   Julian, King of Lemurs


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Van Den Berg, Bert
There seem to be quite a few "rule" followers out there regarding resolution 
cutoffs. One that I have encountered several times is reviewers objecting to 
high Rsym values (say 60-80% in the last shell), which may be even worse than 
using some fixed value of I/sigI.


On 3/3/11 9:55 AM, Ed Pozharski epozh...@umaryland.edu wrote:

On Thu, 2011-03-03 at 12:29 +0100, Roberto Battistutta wrote:
 Does anyone know the origin or the theoretical basis of this I/sigmaI
 > 3.0 rule for an appropriate resolution?

There is none.  Did the editor ask you to follow this suggestion?  I
wonder if there is anyone among the subscribers of this bb who would
come forward and support this I/sigmaI > 3.0 claim.

What was your I/sigma, by the way?  I almost always collect data to
I/sigma=1, which has the downside of generating somewhat higher
R-values.  Shall I, according to this reviewer, retract/amend every
single one of them?  What a mess.

Cheers,

Ed.

--
I'd jump in myself, if I weren't so good at whistling.
   Julian, King of Lemurs




Re: [ccp4bb] stuck with COOT installation in openSUSE 11.3

2011-03-03 Thread Paul Emsley

On 02/28/2011 09:39 PM, Hena Dutta wrote:


I could not open the COOT GUI after installing either from 
'coot-0.6.1-binary-Linux-x86_64-centos-5-gtk2.tar.gz' or from 
'coot-0.6.2-pre-1-revision-3205-binary-Linux-x86_64-centos-5-gtk2.tar.gz'


I have managed to build a coot pre-release for OpenSuse 11.3 64-bit.

http://lmb.bioch.ox.ac.uk/coot/software/binaries/pre-releases/coot-0.6.2-pre-1-revision-3405-binary-Linux_x86_64-OpenSuse-11.3-gtk2.tar.gz

It should just work.

(This is the non-python version, if you want some fun, try compiling 
python 2.6 on OpenSuse 11.3 either 1) without the system readline or 2) 
with a by-hand installed readline.  Grrr... wretched thing - the only 
people who like python are the ones who don't have to compile it... 
end-rant)


Paul.


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Vellieux Frederic
For myself, I decide on the high resolution cutoff by looking at the 
Rsym vs resolution curve. The curve rises, and for all data sets I have 
processed (so far) there is a break in the curve where it shoots 
up, to near vertical. This inflexion point is where I decide to place 
the high resolution cutoff; I never look at the I/sigma(I) values nor at 
the Rsym in the high resolution shell.


As a reviewer, when I have to evaluate a manuscript where very high Rsym 
values are quoted, I have no way of knowing how the high resolution 
cutoff was set. So I simply suggest to the authors to double check this 
cutoff, in order to ensure that the high resolution limit really 
corresponds to high resolution data and not to noise. But I certainly do 
not make statements such as this one.


I have seen cases where, using this rule to decide on the high 
resolution limit, the Rsym in the high resolution bin is well below 50% 
and cases where it is much higher. Like 65%, 70% (0.65, 0.7 if you 
prefer). So, in my opinion, there is no fixed rule as to what the 
acceptable Rsym value in the highest resolution shell should be.


Fred.

Van Den Berg, Bert wrote:
There seem to be quite a few “rule” followers out there regarding 
resolution cutoffs. One that I have encountered several times is 
reviewers objecting to high Rsym values (say 60-80% in the last 
shell), which may be even worse than using some fixed value of I/sigI.



On 3/3/11 9:55 AM, Ed Pozharski epozh...@umaryland.edu wrote:

On Thu, 2011-03-03 at 12:29 +0100, Roberto Battistutta wrote:
 Does anyone know the origin or the theoretical basis of this I/sigmaI
 > 3.0 rule for an appropriate resolution?

There is none. Did the editor ask you to follow this suggestion? I
wonder if there is anyone among the subscribers of this bb who would
come forward and support this I/sigmaI > 3.0 claim.

What was your I/sigma, by the way? I almost always collect data to
I/sigma=1, which has the downside of generating somewhat higher
R-values. Shall I, according to this reviewer, retract/amend every
single one of them? What a mess.

Cheers,

Ed.

--
I'd jump in myself, if I weren't so good at whistling.
Julian, King of Lemurs




Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Jim Pflugrath
As mentioned, there is no I/sigmaI rule.  Also, you need to specify (and
correctly calculate) <I/sigmaI> and not I/<sigmaI>.

A review of similar articles in the same journal will show what is typical
for the journal.  I think you will find that the <I/sigmaI> cutoff varies.
This information can be used in your response to the reviewer, as in: "A
review of actual published articles in the Journal shows that 75% (60 out of
80) used an <I/sigmaI> cutoff of 2 for the resolution of the diffraction
data used in refinement.  We respectfully believe that our cutoff of 2
should be acceptable."
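
A toy numpy illustration of why the two statistics differ (the numbers are
entirely made up; the point is simply that the mean of ratios is not the
ratio of means):

    import numpy as np

    rng = np.random.default_rng(0)
    I = rng.exponential(100.0, size=1000)    # made-up intensities
    sig = 10.0 + 0.1 * I                     # made-up sigma model
    print(np.mean(I / sig))                  # <I/sigmaI>: mean of ratios
    print(np.mean(I) / np.mean(sig))         # I/<sigmaI>: ratio of means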

-Original Message-
From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of
Roberto Battistutta
Sent: Thursday, March 03, 2011 5:30 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: [ccp4bb] I/sigmaI of 3.0 rule

Dear all,
I got a reviewer comment that indicates the need to refine the structures at
an appropriate resolution (I/sigmaI of > 3.0), and re-submit the revised
coordinate files to the PDB for validation. In the manuscript I present
some crystal structures determined by molecular replacement, using the same
protein in a different space group as search model. Does anyone know the
origin or the theoretical basis of this I/sigmaI > 3.0 rule for an
appropriate resolution?
Thanks,
Bye,
Roberto.


Roberto Battistutta
Associate Professor
Department of Chemistry
University of Padua
via Marzolo 1, 35131 Padova - ITALY
tel. +39.049.8275265/67
fax. +39.049.8275239
roberto.battistu...@unipd.it
www.chimica.unipd.it/roberto.battistutta/
VIMM (Venetian Institute of Molecular Medicine) via Orus 2, 35129 Padova -
ITALY tel. +39.049.7923236 fax +39.049.7923250 www.vimm.it


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Phil Evans
My preferred criterion is the half-dataset correlation coefficient output by 
Scala (an idea stolen from the EM guys): I tend to cut my data where this falls 
to not less than 0.5.

The good thing about this is that it is independent of the vagaries of 
I/sigma (or rather of the SD estimation) and has a more intuitive cutoff 
point than Rmeas (let alone Rmerge). It probably doesn't work well at low 
multiplicity, and there is always a problem with anisotropy (I intend to do 
anisotropic analysis in future).
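
A schematic Python sketch of the half-dataset CC idea (an illustration only,
not Scala's implementation; in practice this is computed per resolution
shell):

    import numpy as np

    def half_dataset_cc(obs_by_refl, rng=np.random.default_rng(0)):
        # obs_by_refl: one array of repeated measurements per unique
        # reflection. Split each randomly into two halves, average each
        # half, and correlate the two sets of half-dataset means.
        h1, h2 = [], []
        for obs in obs_by_refl:
            if len(obs) < 2:
                continue
            obs = rng.permutation(obs)
            mid = len(obs) // 2
            h1.append(np.mean(obs[:mid]))
            h2.append(np.mean(obs[mid:]))
        return np.corrcoef(h1, h2)[0, 1]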

That said, the exact resolution cut-off is not really important: if you refine 
& look at maps at, say, 2.6A vs. 2.5A (if that's around the potential cutoff), 
there is probably little significant difference.
Phil

On 3 Mar 2011, at 15:34, Jim Pflugrath wrote:

 As mentioned, there is no I/sigmaI rule.  Also, you need to specify (and
 correctly calculate) <I/sigmaI> and not I/<sigmaI>.
 
 A review of similar articles in the same journal will show what is typical
 for the journal.  I think you will find that the <I/sigmaI> cutoff varies.
 This information can be used in your response to the reviewer, as in: "A
 review of actual published articles in the Journal shows that 75% (60 out of
 80) used an <I/sigmaI> cutoff of 2 for the resolution of the diffraction
 data used in refinement.  We respectfully believe that our cutoff of 2
 should be acceptable."
 
 -Original Message-
 From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of
 Roberto Battistutta
 Sent: Thursday, March 03, 2011 5:30 AM
 To: CCP4BB@JISCMAIL.AC.UK
 Subject: [ccp4bb] I/sigmaI of 3.0 rule
 
 Dear all,
 I got a reviewer comment that indicates the need to refine the structures at
 an appropriate resolution (I/sigmaI of > 3.0), and re-submit the revised
 coordinate files to the PDB for validation. In the manuscript I present
 some crystal structures determined by molecular replacement, using the same
 protein in a different space group as search model. Does anyone know the
 origin or the theoretical basis of this I/sigmaI > 3.0 rule for an
 appropriate resolution?
 Thanks,
 Bye,
 Roberto.
 
 
 Roberto Battistutta
 Associate Professor
 Department of Chemistry
 University of Padua
 via Marzolo 1, 35131 Padova - ITALY
 tel. +39.049.8275265/67
 fax. +39.049.8275239
 roberto.battistu...@unipd.it
 www.chimica.unipd.it/roberto.battistutta/
 VIMM (Venetian Institute of Molecular Medicine) via Orus 2, 35129 Padova -
 ITALY tel. +39.049.7923236 fax +39.049.7923250 www.vimm.it


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Bernhard Rupp (Hofkristallrat a.D.)
I think this suppression of high resolution shells via I/sigI cutoffs is
partially attributable to a conceptual misunderstanding of what these (darn)
R-values mean in refinement versus data merging. 

In refinement, even a random atom structure follows the Wilson distribution,
and therefore even a completely wrong non-centrosymmetric structure will
not - given proper scaling - give an Rf of more than 59%. 

There is no such limit for the basic linear merging R. However, there is a
simple relation between <I/sigI> and R-merge (provided no other indecency
has been done to the data). It simply is (BMC) Rm = 0.8/<I/sigI>. I.e. for
<I/sigI> = 0.8 you get 100%, for 2 we obtain 40%, which, interpreted as Rf,
would be dreadful, but for <I/sigI> = 3 we get Rm = 0.27, and that looks
acceptable for an Rf (or to an uninformed reviewer).
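
Plugging in the numbers just quoted (a two-line check of Rm = 0.8/<I/sigI>):

    for i_over_sig in (0.8, 2.0, 3.0):
        print(f"<I/sigI> = {i_over_sig}  ->  Rm ~ {0.8 / i_over_sig:.0%}")
    # <I/sigI> = 0.8  ->  Rm ~ 100%
    # <I/sigI> = 2.0  ->  Rm ~ 40%
    # <I/sigI> = 3.0  ->  Rm ~ 27%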

Btw, I also wish to point out that I/sig cutoffs are not exactly the
cutoff criterion for anomalous phasing; a more direct measure is a signal
cutoff such as delF/sig(delF) - George, I believe, uses 1.3 for SAD.
Interestingly, in almost all structures I played with - both with noise in
the anomalous data and with no anomalous scatterer present - the
anomalous signal delF/sig(delF) was 0.8. I haven't figured out yet or proved
the statistics, or whether this is generally true or just numerology...

And, the usual biased rant - irrespective of Hamilton tests, nobody really
needs these popular unweighted linear residuals which shall not be named,
particularly on F. They only cause trouble.  

Best regards, BR
-
Bernhard Rupp
001 (925) 209-7429
+43 (676) 571-0536
b...@ruppweb.org
hofkristall...@gmail.com
http://www.ruppweb.org/
-
Structural Biology is the practice of
crystallography without a license.
-

-Original Message-
From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of Bart
Hazes
Sent: Thursday, March 03, 2011 7:08 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: Re: [ccp4bb] I/sigmaI of 3.0 rule

There seems to be an epidemic of papers with I/Sigma > 3 (sometimes much
larger). In fact such cases have become so frequent that I fear some people
have started to believe that this is the proper procedure. I don't know where
that has come from, as the I/Sigma ~ 2 criterion was established long ago and
many consider even that a tad conservative. It simply pains me to see people
going to the most advanced synchrotrons to boost their highest resolution
data and then simply throw away much of it.

I don't know what has caused this wave of high I/Sigma threshold use but
here are some ideas

- High I/Sigma cutoffs are normal for (S/M)AD data sets where a more strict
focus on data quality is needed.
Perhaps some people have started to think this is the norm.

- For some datasets Rsym goes up strongly while I/SigI is still reasonable. I
personally believe this is due to radiation damage, which affects Rsym (which
compares reflections taken after different amounts of exposure) much more
than I/SigI, which is based on individual reflections. A good test would be
to see if processing only the first half of the dataset improves Rsym (or
better, Rrim)

- Most detectors are square and if the detector is too far from the crystal
then the highest resolution data falls beyond the edges of the detector. In
this case one could, and should, still process data into the corners of the
detector. Data completeness at higher resolution may suffer but each
additional reflection still represents an extra restraint in refinement and
a Fourier term in the map. Due to crystal symmetry the effect on
completeness may even be less than expected.

Bart


On 11-03-03 04:29 AM, Roberto Battistutta wrote:
 Dear all,
 I got a reviewer comment that indicates the need to refine the structures
at an appropriate resolution (I/sigmaI of > 3.0), and re-submit the revised
coordinate files to the PDB for validation. In the manuscript I present
some crystal structures determined by molecular replacement, using the same
protein in a different space group as search model. Does anyone know the
origin or the theoretical basis of this I/sigmaI > 3.0 rule for an
appropriate resolution?
 Thanks,
 Bye,
 Roberto.


 Roberto Battistutta
 Associate Professor
 Department of Chemistry
 University of Padua
 via Marzolo 1, 35131 Padova - ITALY
 tel. +39.049.8275265/67
 fax. +39.049.8275239
 roberto.battistu...@unipd.it
 www.chimica.unipd.it/roberto.battistutta/
 VIMM (Venetian Institute of Molecular Medicine) via Orus 2, 35129 
 Padova - ITALY tel. +39.049.7923236 fax +39.049.7923250 www.vimm.it


-- 



Bart Hazes (Associate Professor)
Dept. of Medical Microbiology & Immunology, University of Alberta
1-15 Medical Sciences Building
Edmonton, Alberta
Canada, T6G 2H7
phone:  1-780-492-0042
fax:

Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Ed Pozharski
On Thu, 2011-03-03 at 16:02 +0100, Vellieux Frederic wrote:
 For myself, I decide on the high resolution cutoff by looking at the 
 Rsym vs resolution curve. The curve rises, and for all data sets I have 
 processed (so far) there is a break in the curve where it shoots 
 up, to near vertical. This inflexion point is where I decide to place 
 the high resolution cutoff; I never look at the I/sigma(I) values nor at 
 the Rsym in the high resolution shell.
 

Fred,

while your procedure is definitely more sophisticated than what I do,
let me point out that Rsym is genuinely a bad measure for this, as
it depends strongly on redundancy.  Do more robust measures (e.g.
Rpim) show a similar inflexion?  I suspect it will at least shift
towards higher resolution.
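
(For reference, the redundancy dependence is explicit in the standard
definitions, with n the multiplicity of each unique reflection:

    R_{merge} = \frac{ \sum_{hkl} \sum_i | I_i(hkl) - \langle I(hkl) \rangle | }{ \sum_{hkl} \sum_i I_i(hkl) }

    R_{pim} = \frac{ \sum_{hkl} \sqrt{ \frac{1}{n-1} } \sum_i | I_i(hkl) - \langle I(hkl) \rangle | }{ \sum_{hkl} \sum_i I_i(hkl) }

Rmerge rises with multiplicity even at constant data quality, whereas the
1/(n-1) factor makes Rpim track the precision of the averaged intensity.)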

Cheers,

Ed.

-- 
I'd jump in myself, if I weren't so good at whistling.
   Julian, King of Lemurs


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Ed Pozharski
On Thu, 2011-03-03 at 08:08 -0700, Bart Hazes wrote:
 I don't know what has caused this wave of high I/Sigma threshold use
 but 
 here are some ideas
 

It may also be related to what I feel is a recent revival of the
significance of R-values in general.  Lower resolution cutoffs in
this context improve the R-values, which is (incorrectly) perceived as
model improvement.

-- 
I'd jump in myself, if I weren't so good at whistling.
   Julian, King of Lemurs


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Van Den Berg, Bert
Does the position of this inflection point depend on the redundancy? Maybe it 
does not; for high-redundancy data one would simply get a much higher 
corresponding Rsym.


On 3/3/11 11:13 AM, Ed Pozharski epozh...@umaryland.edu wrote:

On Thu, 2011-03-03 at 16:02 +0100, Vellieux Frederic wrote:
 For myself, I decide on the high resolution cutoff by looking at the
 Rsym vs resolution curve. The curve rises, and for all data sets I have
 processed (so far) there is a break in the curve where it shoots
 up, to near vertical. This inflexion point is where I decide to place
 the high resolution cutoff; I never look at the I/sigma(I) values nor at
 the Rsym in the high resolution shell.


Fred,

while your procedure is definitely more sophisticated than what I do,
let me point out that Rsym is genuinely a bad measure for this, as
it depends strongly on redundancy.  Do more robust measures (e.g.
Rpim) show a similar inflexion?  I suspect it will at least shift
towards higher resolution.

Cheers,

Ed.

--
I'd jump in myself, if I weren't so good at whistling.
   Julian, King of Lemurs




Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Ed Pozharski
On Thu, 2011-03-03 at 09:34 -0600, Jim Pflugrath wrote:
 As mentioned, there is no I/sigmaI rule.  Also, you need to specify (and
 correctly calculate) <I/sigmaI> and not I/<sigmaI>.
 
 A review of similar articles in the same journal will show what is typical
 for the journal.  I think you will find that the <I/sigmaI> cutoff varies.
 This information can be used in your response to the reviewer, as in: "A
 review of actual published articles in the Journal shows that 75% (60 out of
 80) used an <I/sigmaI> cutoff of 2 for the resolution of the diffraction
 data used in refinement.  We respectfully believe that our cutoff of 2
 should be acceptable."
 

Jim,

Excellent point.  Such statistics would be somewhat tedious to gather,
though; does anyone know if I/sigma stats are available for the whole
PDB somewhere?

On your first point, though - why is one better than the other?  My
experimental observation is that while the two differ significantly at low
resolution (what matters, of course, is I/sigma itself and not the
resolution per se), at high resolution, where the cutoff is chosen, they
are not that different.  And since the cutoff value itself is rather
arbitrarily chosen, why is <I/sigma> better than I/<sigma>?

Cheers,

Ed.


-- 
I'd jump in myself, if I weren't so good at whistling.
   Julian, King of Lemurs


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Bernhard Rupp (Hofkristallrat a.D.)
 related to what I feel is a recent revival of the significance of R-values

because it's so handy to have one single number to judge a highly complex 
nonlinear multivariate barely determined regularized problem! Just as easy as 
running a gel!

Best BR

-Original Message-
From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of Ed 
Pozharski
Sent: Thursday, March 03, 2011 8:19 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: Re: [ccp4bb] I/sigmaI of 3.0 rule

On Thu, 2011-03-03 at 08:08 -0700, Bart Hazes wrote:
 I don't know what has caused this wave of high I/Sigma threshold use 
 but here are some ideas
 

It may also be related to what I feel is a recent revival of the significance of 
R-values in general.  Lower resolution cutoffs in this context improve the 
R-values, which is (incorrectly) perceived as model improvement.

--
I'd jump in myself, if I weren't so good at whistling.
   Julian, King of Lemurs


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Vellieux Frederic

Hi,

I don't think XDS generates an Rpim value, does it? The XDS CORRECT 
step provides the old-fashioned Rsym (R-FACTOR) plus R-meas and Rmrgd-F.

The curves all look the same, though.

Fred.

Ed Pozharski wrote:

On Thu, 2011-03-03 at 16:02 +0100, Vellieux Frederic wrote:

 For myself, I decide on the high resolution cutoff by looking at the
 Rsym vs resolution curve. The curve rises, and for all data sets I have
 processed (so far) there is a break in the curve where it shoots
 up, to near vertical. This inflexion point is where I decide to place
 the high resolution cutoff; I never look at the I/sigma(I) values nor at
 the Rsym in the high resolution shell.

Fred,

while your procedure is definitely more sophisticated than what I do,
let me point out that Rsym is genuinely a bad measure for this, as
it depends strongly on redundancy.  Do more robust measures (e.g.
Rpim) show a similar inflexion?  I suspect it will at least shift
towards higher resolution.

Cheers,

Ed.


Re: [ccp4bb] Is there any program for specifically calculating Rvalue in CCP4

2011-03-03 Thread Pavel Afonine
Hi,

- phenix.fmodel will probably output it too or maybe there is some
 phenix.get_me_my_damn_rfactor or such


as Eric pointed out earlier, the command

phenix.model_vs_data model.pdb data.mtz

will do exactly this.

Pavel.

P.S.:

phenix.fmodel is the tool to compute the total model structure factor; this
one,
phenix.get_me_my_damn_rfactor,
does not exist -:)


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Ronald E Stenkamp

Discussions of I/sigma(I) or less-than cutoffs have been going on for at least 
35 years.  For example, see Acta Cryst. (1975) B31, 1507-1509.  I was taught by 
my elders (mainly Lyle Jensen) that less-than cutoffs came into use when 
diffractometers replaced film methods for small molecule work, i.e., in the 
1960s.  To compare new and old structures, they needed some criterion for the 
electronic measurements that would correspond to the fog level on their films.  
People settled on 2 sigma cutoffs (on I, which means 4 sigma on F), but 
subsequently the cutoffs got higher and higher, as people realized they could 
get lower and lower R values by throwing away the weak reflections.  I'm 
unaware of any statistical justification for any cutoff.  The approach I like 
most is to refine on F-squared and use every reflection.  Error estimates and 
weighting schemes should take care of the noise.
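
(Schematically, the weighted least-squares target behind "refine on F-squared
and use every reflection", as in standard small-molecule practice, is

    \min \sum_{hkl} w_{hkl} \, ( F_o^2 - k F_c^2 )^2 ,  with  w_{hkl} = 1 / \sigma^2(F_o^2)

so a weak reflection simply enters with the weight its error estimate
dictates, rather than being discarded.)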

Ron

On Thu, 3 Mar 2011, Ed Pozharski wrote:


On Thu, 2011-03-03 at 09:34 -0600, Jim Pflugrath wrote:

As mentioned, there is no I/sigmaI rule.  Also, you need to specify (and
correctly calculate) <I/sigmaI> and not I/<sigmaI>.

A review of similar articles in the same journal will show what is typical
for the journal.  I think you will find that the <I/sigmaI> cutoff varies.
This information can be used in your response to the reviewer, as in: "A
review of actual published articles in the Journal shows that 75% (60 out of
80) used an <I/sigmaI> cutoff of 2 for the resolution of the diffraction
data used in refinement.  We respectfully believe that our cutoff of 2
should be acceptable."



Jim,

Excellent point.  Such statistics would be somewhat tedious to gather,
though; does anyone know if I/sigma stats are available for the whole
PDB somewhere?

On your first point, though - why is one better than the other?  My
experimental observation is that while the two differ significantly at low
resolution (what matters, of course, is I/sigma itself and not the
resolution per se), at high resolution, where the cutoff is chosen, they
are not that different.  And since the cutoff value itself is rather
arbitrarily chosen, why is <I/sigma> better than I/<sigma>?

Cheers,

Ed.


--
I'd jump in myself, if I weren't so good at whistling.
  Julian, King of Lemurs



Re: [ccp4bb] I/sigmaI of 3.0 rule- do not underestimate gels

2011-03-03 Thread Felix Frolow
Well BR, do not underestimate the complexity of running a gel! There are even 
harsher referee comments on gel appearance and quality 
than comments on cutting data based on R, Rf and sigmaI :-)
Especially when one is trying to penetrate into prestigious journals...
Dr Felix Frolow   
Professor of Structural Biology and Biotechnology
Department of Molecular Microbiology
and Biotechnology
Tel Aviv University 69978, Israel

Acta Crystallographica F, co-editor

e-mail: mbfro...@post.tau.ac.il
Tel:  ++972-3640-8723
Fax: ++972-3640-9407
Cellular: 0547 459 608

On Mar 3, 2011, at 18:38 , Bernhard Rupp (Hofkristallrat a.D.) wrote:

 related to what I feel is a recent revival of the significance of R-values
 
 because it's so handy to have one single number to judge a highly complex 
 nonlinear multivariate barely determined regularized problem! Just as easy as 
 running a gel!
 
 Best BR
 
 -Original Message-
 From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of Ed 
 Pozharski
 Sent: Thursday, March 03, 2011 8:19 AM
 To: CCP4BB@JISCMAIL.AC.UK
 Subject: Re: [ccp4bb] I/sigmaI of 3.0 rule
 
 On Thu, 2011-03-03 at 08:08 -0700, Bart Hazes wrote:
 I don't know what has caused this wave of high I/Sigma threshold use 
 but here are some ideas
 
 
 It may also be related to what I feel is a recent revival of the significance 
 of R-values in general.  Lower resolution cutoffs in this context improve 
 the R-values, which is (incorrectly) perceived as model improvement.
 
 --
 I'd jump in myself, if I weren't so good at whistling.
   Julian, King of Lemurs


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Maia Cherney

Dear Bernhard

I am wondering where I should cut my data off. Here is the statistics 
from XDS processing.


Maia

 SUBSET OF INTENSITY DATA WITH SIGNAL/NOISE >= -3.0 AS FUNCTION OF RESOLUTION

 RESOLUTION   NUMBER OF REFLECTIONS       COMPLETENESS  R-FACTOR  R-FACTOR  COMPARED  I/SIGMA  R-meas  Rmrgd-F  Anomal  SigAno   Nano
   LIMIT     OBSERVED  UNIQUE  POSSIBLE    OF DATA      observed  expected                                      Corr

   10.06        5509     304      364        83.5%        3.0%      4.4%      5509    63.83     3.1%     1.0%     11%   0.652    173
    7.12       11785     595      595       100.0%        3.5%      4.8%     11785    59.14     3.6%     1.4%    -10%   0.696    414
    5.81       15168     736      736       100.0%        5.0%      5.6%     15168    51.88     5.1%     1.8%     -9%   0.692    561
    5.03       17803     854      854       100.0%        5.5%      5.7%     17803    50.02     5.6%     2.2%    -10%   0.738    675
    4.50       20258     964      964       100.0%        5.1%      5.4%     20258    52.61     5.3%     2.1%    -16%   0.710    782
    4.11       22333    1054     1054       100.0%        5.6%      5.7%     22333    50.89     5.8%     2.0%    -16%   0.705    878
    3.80       23312    1137     1137       100.0%        7.0%      6.6%     23312    42.95     7.1%     3.0%    -13%   0.770    952
    3.56       25374    1207     1208        99.9%        7.6%      7.3%     25374    40.56     7.8%     3.4%    -18%   0.739   1033
    3.35       27033    1291     1293        99.8%        9.7%      9.2%     27033    33.73    10.0%     4.1%    -12%   0.765   1107
    3.18       29488    1353     1353       100.0%       11.6%     11.6%     29488    28.16    11.9%     4.4%     -7%   0.750   1176
    3.03       31054    1419     1419       100.0%       15.7%     15.9%     31054    21.77    16.0%     6.9%     -9%   0.741   1243
    2.90       32288    1478     1478       100.0%       21.1%     21.6%     32288    16.99    21.6%     9.2%     -6%   0.745   1296
    2.79       33807    1542     1542       100.0%       28.1%     28.8%     33807    13.07    28.8%    12.9%     -2%   0.783   1361
    2.69       34983    1604     1604       100.0%       37.4%     38.7%     34983     9.95    38.3%    17.2%     -2%   0.743   1422
    2.60       35163    1653     1653       100.0%       48.8%     48.0%     35163     8.03    50.0%    21.9%     -6%   0.754   1475
    2.52       36690    1699     1699       100.0%       54.0%     56.0%     36690     6.98    55.3%    25.9%      0%   0.745   1517
    2.44       37751    1757     1757       100.0%       67.9%     70.4%     37751     5.61    69.5%    32.5%     -5%   0.733   1577
    2.37       38484    1798     1799        99.9%       82.2%     84.5%     38484     4.72    84.2%    36.5%      2%   0.753   1620
    2.31       39098    1842     1842       100.0%       91.4%     94.3%     39098     4.19    93.7%    43.7%     -3%   0.744   1661
    2.25       38809    1873     1923        97.4%      143.4%    139.3%     38809     2.84   147.1%    69.8%     -2%   0.693   1696
   total      556190   26160    26274        99.6%       11.9%     12.2%    556190    21.71    12.2%     9.7%     -5%   0.739  22619




Bernhard Rupp (Hofkristallrat a.D.) wrote:

I think this suppression of high resolution shells via I/sigI cutoffs is
partially attributable to a conceptual misunderstanding of what these (darn)
R-values mean in refinement versus data merging. 


In refinement, even a random atom structure follows the Wilson distribution,
and therefore, even a completely wrong non-centrosymmetric structure will
not  - given proper scaling - give an Rf of more than 59%. 


There is no such limit for the basic linear merging R. However, there is a
simple relation between <I/sigI> and R-merge (provided no other indecency
has been done to the data). It simply is (BMC) Rm = 0.8/<I/sigI>. I.e. for
<I/sigI> = 0.8 you get 100%, for 2 we obtain 40%, which, interpreted as Rf,
would be dreadful, but for <I/sigI> = 3 we get Rm = 0.27, and that looks
acceptable for an Rf (or to an uninformed reviewer).


Btw, I also wish to point out that I/sig cutoffs are not exactly the
cutoff criterion for anomalous phasing; a more direct measure is a signal
cutoff such as delF/sig(delF) - George, I believe, uses 1.3 for SAD.
Interestingly, in almost all structures I played with - both with noise in
the anomalous data and with no anomalous scatterer present - the
anomalous signal delF/sig(delF) was 0.8. I haven't figured out yet or proved
the statistics, or whether this is generally true or just numerology...

And, the usual biased rant - irrespective of Hamilton tests, nobody really
needs these popular unweighted linear residuals which shall not be named,
particularly on F. They only cause trouble.  


Best regards, BR
-
Bernhard Rupp
001 (925) 209-7429
+43 (676) 571-0536
b...@ruppweb.org
hofkristall...@gmail.com
http://www.ruppweb.org/
-

Structural Biology is the practice of
crystallography without a license.
-

-Original Message-
From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of Bart
Hazes
Sent: Thursday, March 03, 2011 7:08 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: Re: [ccp4bb] I/sigmaI of 3.0 rule

There seems to be an epidemic of papers with I/Sigma > 3 (sometimes much
larger). In fact such cases have become so frequent that I fear some people
have started to believe that this is the proper procedure. I don't know where
that has come from, as the I/Sigma ~ 2 criterion was established long ago and
many consider even that a tad conservative. It simply pains me to see people
going to the most advanced synchrotrons to boost their highest resolution
data and then simply throw away much of it.

I don't know what has caused this wave of high I/Sigma threshold use but
here are some ideas

- High I/Sigma cutoffs are normal for (S/M)AD data sets where a more strict
focus on data quality is needed.
Perhaps some people have started to think this is the norm.

- For some dataset Rsym goes up strongly while I/SigI 

Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Simon Phillips

I take the point about a tendency in those days to apply sigma cutoffs to get 
lower R values, which were erroneously expected to indicate better structures.  
I wonder how many of us remember the paper by Arnberg et al. (1979) Acta Cryst 
A35, 497-499, where it is shown for (small molecule) structures that had been 
refined with only reflections I > 3*sigma(I) that the models were degraded by 
leaving out weak data (although the R factors looked better, of course).

Arnberg et al. took published structures and showed the refined models got 
better when the weak data were included.  The best bit, I think, was when they 
went on to demonstrate successful refinement of a structure using ONLY the weak 
data, where I < 3*sigma(I), and ignoring all the strong ones.  This shows, as was 
alluded to earlier in the discussion, that a weak reflection puts a powerful 
constraint on a refinement, especially if there are other stronger reflections 
in the same resolution range.

---
| Simon E.V. Phillips |
---
| Director, Research Complex at Harwell (RCaH)|
| Rutherford Appleton Laboratory  |
| Harwell Science and Innovation Campus   |
| Didcot  |
| Oxon OX11 0FA   |
| United Kingdom  |
| Email: simon.phill...@rc-harwell.ac.uk  |
| Tel:   +44 (0)1235 567701   |
|+44 (0)1235 567700 (sec) |
|+44 (0)7884 436011 (mobile)  |
| www.rc-harwell.ac.uk|
---
| Astbury Centre for Structural Molecular Biology |
| Institute of Molecular and Cellular Biology |
| University of LEEDS |
| LEEDS LS2 9JT   |
| United Kingdom  |
| Email: s.e.v.phill...@leeds.ac.uk   |
| Tel:   +44 (0)113 343 3027  |
| WWW:   http://www.astbury.leeds.ac.uk/People/staffpage.php?StaffID=SEVP |
---


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Maia Cherney

I have to resend my statistics.

Maia Cherney wrote:

Dear Bernhard

I am wondering where I should cut my data off. Here is the statistics 
from XDS processing.


Maia





On 11-03-03 04:29 AM, Roberto Battistutta wrote:
 

Dear all,
I got a reviewer comment that indicates the need to refine the structures
at an appropriate resolution (I/sigmaI of > 3.0), and re-submit the revised
coordinate files to the PDB for validation. In the manuscript I present
some crystal structures determined by molecular replacement, using the same
protein in a different space group as search model. Does anyone know the
origin or the theoretical basis of this I/sigmaI > 3.0 rule for an
appropriate resolution?
 

Thanks,
Bye,
Roberto.


Roberto Battistutta
Associate Professor
Department of Chemistry
University of Padua
via Marzolo 1, 35131 Padova - ITALY
tel. +39.049.8275265/67
fax. +39.049.8275239
roberto.battistu...@unipd.it
www.chimica.unipd.it/roberto.battistutta/
VIMM (Venetian Institute of Molecular Medicine) via Orus 2, 35129 
Padova - ITALY tel. +39.049.7923236 fax +39.049.7923250 www.vimm.it





  




 SUBSET OF INTENSITY DATA WITH SIGNAL/NOISE >= -3.0 AS FUNCTION OF RESOLUTION

 RESOLUTION   NUMBER OF REFLECTIONS       COMPLETENESS  R-FACTOR  R-FACTOR  COMPARED  I/SIGMA  R-meas  Rmrgd-F  Anomal  SigAno   Nano
   LIMIT     OBSERVED  UNIQUE  POSSIBLE    OF DATA      observed  expected                                      Corr

   10.06        5509     304      364        83.5%        3.0%      4.4%      5509    63.83     3.1%     1.0%     11%   0.652    173
    7.12       11785     595      595       100.0%        3.5%      4.8%     11785    59.14     3.6%     1.4%    -10%   0.696    414
    5.81       15168     736      736       100.0%        5.0%      5.6%     15168    51.88     5.1%     1.8%     -9%   0.692    561
    5.03       17803     854      854       100.0%        5.5%      5.7%     17803    50.02     5.6%     2.2%    -10%   0.738    675
    4.50       20258     964      964       100.0%        5.1%      5.4%     20258    52.61     5.3%     2.1%    -16%   0.710    782
    4.11       22333    1054     1054       100.0%        5.6%      5.7%     22333    50.89     5.8%     2.0%    -16%   0.705    878
    3.80       23312    1137     1137       100.0%        7.0%      6.6%     23312    42.95     7.1%     3.0%    -13%   0.770    952
    3.56       25374    1207     1208        99.9%        7.6%      7.3%     25374    40.56     7.8%     3.4%    -18%   0.739   1033
    3.35       27033    1291     1293        99.8%        9.7%      9.2%     27033    33.73    10.0%     4.1%    -12%   0.765   1107
    3.18       29488    1353     1353       100.0%       11.6%     11.6%     29488    28.16    11.9%     4.4%     -7%   0.750   1176
    3.03       31054    1419     1419       100.0%       15.7%     15.9%     31054    21.77    16.0%     6.9%     -9%   0.741   1243
    2.90       32288    1478     1478       100.0%       21.1%     21.6%     32288    16.99    21.6%     9.2%     -6%   0.745   1296
    2.79       33807    1542     1542       100.0%       28.1%     28.8%     33807    13.07    28.8%    12.9%     -2%   0.783   1361
    2.69       34983    1604     1604       100.0%       37.4%     38.7%     34983     9.95    38.3%    17.2%     -2%   0.743   1422
    2.60       35163    1653     1653       100.0%       48.8%     48.0%     35163     8.03    50.0%    21.9%     -6%   0.754   1475
    2.52       36690    1699     1699       100.0%       54.0%     56.0%     36690     6.98    55.3%    25.9%      0%   0.745   1517
    2.44       37751    1757     1757       100.0%       67.9%     70.4%     37751     5.61    69.5%    32.5%     -5%   0.733   1577
    2.37       38484    1798     1799        99.9%       82.2%     84.5%     38484     4.72    84.2%    36.5%      2%   0.753   1620
    2.31       39098    1842     1842       100.0%       91.4%     94.3%     39098     4.19    93.7%    43.7%     -3%   0.744   1661
    2.25       38809    1873     1923        97.4%      143.4%    139.3%     38809     2.84   147.1%    69.8%     -2%   0.693   1696
   total      556190   26160    26274        99.6%       11.9%     12.2%    556190    21.71    12.2%     9.7%     -5%   0.739  22619


Re: [ccp4bb] I/sigmaI of 3.0 rule- do not underestimate gels

2011-03-03 Thread Bernhard Rupp (Hofkristallrat a.D.)
there are even harsher referees' comments on gel appearance and quality
than comments on cutting data based on R, RF and sigmaI :-)  Especially when
one is trying to penetrate into prestigious journals...

Ok I repent. For improving gels there is the same excellent program, also
useful for density modification - Photoshop ;-)

Best, BR

Dr Felix Frolow   
Professor of Structural Biology and Biotechnology Department of Molecular
Microbiology and Biotechnology Tel Aviv University 69978, Israel

Acta Crystallographica F, co-editor

e-mail: mbfro...@post.tau.ac.il
Tel:  ++972-3640-8723
Fax: ++972-3640-9407
Cellular: 0547 459 608

On Mar 3, 2011, at 18:38 , Bernhard Rupp (Hofkristallrat a.D.) wrote:

 related to what I feel is recent revival of the significance of the 
 R-values
 
 because it's so handy to have one single number to judge a highly
complex nonlinear multivariate barely determined regularized problem! Just
as easy as running a gel!
 
 Best BR
 
 -Original Message-
 From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of 
 Ed Pozharski
 Sent: Thursday, March 03, 2011 8:19 AM
 To: CCP4BB@JISCMAIL.AC.UK
 Subject: Re: [ccp4bb] I/sigmaI of 3.0 rule
 
 On Thu, 2011-03-03 at 08:08 -0700, Bart Hazes wrote:
 I don't know what has caused this wave of high I/Sigma threshold use 
 but here are some ideas
 
 
 It may also be related to what I feel is a recent revival of the
significance of the R-values in general.  Lower resolution cutoffs in this
context improve the R-values, which is (incorrectly) perceived as model
improvement.
 
 --
 I'd jump in myself, if I weren't so good at whistling.
   Julian, King of Lemurs


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Van Den Berg, Bert
We should compile this discussion and send it as compulsive reading to journal 
editors...;-)

Bert


On 3/3/11 12:07 PM, Simon Phillips s.e.v.phill...@leeds.ac.uk wrote:



I take the point about a tendency in those days to apply sigma cutoffs to get 
lower R values, which were erroneously expected to indicate better structures.  
I wonder how many of us remember this paper by Arnberg et al (1979) Acta Cryst 
A35, 497-499, where it is shown for (small molecule) structures that had been 
refined with only reflections I > 3*sigma(I) that the models were degraded by 
leaving out weak data (although the R factors looked better of course).

Arnberg et al took published structures and showed the refined models got 
better when the weak data were included.  The best bit, I think, was when they 
went on to demonstrate successful refinement of a structure using ONLY the weak 
data where I < 3*sigma(I) and ignoring all the strong ones.  This shows, as was 
alluded to earlier in the discussion, that a weak reflection puts a powerful 
constraint on a refinement, especially if there are other stronger reflections 
in the same resolution range.




Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Jacob Keller
When will we finally jettison Rsym/Rcryst/Rmerge?

1. Perhaps software developers should either not even calculate the
number, or hide it somewhere obscure, and of course replace it with
a better R flavor?

2. Maybe reviewers should insist on other R's (Rpim etc) instead of Rmerge?

JPK

PS is this as quixotic as chucking the QWERTY keyboard, or using
Esperanto? I don't think so!




-- 
***
Jacob Pearson Keller
Northwestern University
Medical Scientist Training Program
cel: 773.608.9185
email: j-kell...@northwestern.edu
***


Re: [ccp4bb] Problem with refinement and positive electron density

2011-03-03 Thread Ethan Merritt
On Thursday, March 03, 2011 05:10:02 am Judith Reeks wrote:
 Dear Eleanor,
 
 I don't think that was the case. I add 'tlsd waters exclude' to the 
 refinement, which as I understand it should prevent that. Also, I checked 
 the output files and the TLS groups only involve my protein, waters and 
 ligands aren't included anywhere. I can't explain why my TLS restraints 
 were giving me problems.

Without looking at your files I can't be certain, but

The usual problem is that refmac applies clipping limits to B factors.
In the absence of TLS, this means that the minimum allowed B is
(by default) 2.0 and the maximum is (I think) 200.   So far, so good.
But in the presence of a TLS model,  these same clipping limits are applied
to the _incremental_ Biso values, and that's just wrong.  I.e., if the
current set of TLS parameters over-estimates B for some particular
atom, the density gradient will try to drive the incremental B value
for that atom negative to compensate.  But the program doesn't let
that happen.  The result is that the incremental B values bottom out at 2.0
and the refinement goes pear-shaped.

I think the simplest fix for this would be to have refmac detect that
it has happened and modify the current TLS parameters to shift all
the predicted B values downward.  That way the required incremental
Biso become >= 0 and the problem is avoided. 
An alternative, but more difficult to implement, fix would be to let the
incremental B values go negative.  
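
To make the mechanism concrete, here is a toy sketch of the clipping
arithmetic (illustration only, not refmac source; the 2.0 limit is the
default mentioned above, and the B values are invented):

    # Toy model of the clipping problem described above (not refmac code).
    B_MIN = 2.0  # default lower limit, applied to the *incremental* Biso

    def total_b(b_tls, b_incr):
        # Total B = TLS-predicted contribution + clipped per-atom increment.
        return b_tls + max(b_incr, B_MIN)

    # Hypothetical atom: TLS over-predicts B as 25.0 while the data want 20.0.
    # Refinement needs b_incr = -5.0, but the clip forces the total upward:
    print(total_b(25.0, -5.0))  # -> 27.0, the increment bottoms out at 2.0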

The paired *.pdb and *.tlsin files output by the TLSMD server specifically
avoid this problem:  the TLS parameters are chosen such that the 
incremental B values required to best reproduce your input model are
always positive.   But as you continue refinement this criterion may 
gradually be lost, and as noted above refmac doesn't currently detect
or fix this for you.

Ethan


 
 Regards,
 
 Judith
 
 On 03/03/2011 10:49, Eleanor Dodson wrote:
  I think you have been caught by a new REFMAC feature which tries  to 
  design its own TLS groups including linked H2Os and ligands.
 
  Check your tls output records and see what it has clustered into a 
  group..
 
  I am not sure how to disable this - at times I want to override any 
  automatic selection..
  Eleanor
 
  On 03/01/2011 07:48 PM, Judith Reeks wrote:
  Dear All,
 
  Thank you for your suggestions. Many of you asked what the occupancies
  were in the region and they were all one, so partial occupancy was not
  the problem.
 
  I was using TLS restraints during the refinement when this problem
  happened. Given the suggestions that TLS may be a problem and that might
  be causing the low B-factors, I went back and re-ran the refinement
  without TLS and the problem disappeared. Then I submitted my latest file
  to the TLSMD server for new restraints and the next round of refinement
  got rid of the problem. The B-factors increased to normal levels (~15
  compared to ~5 before) so it seems to have done the trick.
 
  Thank you to everybody for their help,
 
  Judith Reeks
 
  ja...@st-andrews.ac.uk
 
  School of Chemistry
 
  University of St Andrews
 
 
  On 01/03/2011 17:28, Mark Robien wrote:
  Hmm - kinda interesting
 
  In addition to the sorts of things suggested by Mark van Raaij, older
  versions of Refmac
  were prone to have a phenomenon with B factors - once they get over a
  certain level, the algorithm
  has a very hard time bring them back down, even when the data suggests
  it.
 
  I've usually seen it with much higher B factors than you seem to have
  here - for example, loops where the B's
  are actually 40-60, but previous rounds of refinement have the B
  factors 60-80 or higher.
  Judging from the coot screen & the residues you are focusing on, I
  doubt that is the answer (unless you also
  have a TLS model, in which case I'd wonder). If you do have TLS -
  well, things get more complicated; for example,
  is this the edge of a TLS domain?
 
  Nonetheless, you could try the solution for the problem that I
  describe - which is to reset all the B factors to a (very)
  low B factor - maybe even as low as 2.0 (lower?), and then another
  round of refmac - with a sufficient number of
  cycles - will re-refine the B (and xyz), thus escaping the region of
  refinement space that has a very weak gradient.
 
  A variant approach that might be appropriate - similarly reset the B
  for the (?small) region of your model that has the problem.
 
  Sadly, it's been a while since I did any refinement myself - but the
  Uppsala suite had some of the nicer tools for resetting B within pdb
  files, without having to do it manually (ugh - not appealing) - or
  writing an ad hoc awk script (a very easy alternative, if you're
  familiar enough; see the sketch below).
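
(A minimal Python stand-in for such an ad hoc script - an illustrative
sketch assuming the standard fixed-column PDB format, with the B factor in
columns 61-66; the file names and the reset value are hypothetical:)

    # Reset the B factor of every ATOM/HETATM record to a fixed value.
    NEW_B = 2.0  # hypothetical reset value, per the suggestion above

    with open("in.pdb") as fin, open("reset_b.pdb", "w") as fout:
        for line in fin:
            if line.startswith(("ATOM", "HETATM")):
                line = f"{line[:60]}{NEW_B:6.2f}{line[66:]}"
            fout.write(line)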
 
  Mark Robien
 
 
  On 3/1/2011 10:32 AM, Judith Reeks wrote:
  Dear All,
 
  I am currently refining a structure using the latest experimental
  version of Refmac (5.6) and there seems to be a problem with my Fo-Fc
  map. There is a region where I have 

Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Bernhard Rupp (Hofkristallrat a.D.)
First of all I would ask an XDS expert for that because I don't know exactly
what stats the XDS program reports (shame on me, ok) nor what the quality of
your error model is, or what you want to use the data for (I guess
refinement - see Eleanor's response for that, and use all data).

There is one point I'd like to make re cutoff: If one gets greedy and
collects too much noise in high resolution shells (like way below I/sigI =
0.8 or so) the scaling/integration may suffer from an overabundance of
nonsense data, and here I believe it makes sense to select a higher cutoff
(like what exactly?) and reprocess the data. Maybe one of our data
collection specialists should comment on that.

BR

-Original Message-
From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of Maia
Cherney
Sent: Thursday, March 03, 2011 9:13 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: Re: [ccp4bb] I/sigmaI of 3.0 rule

I have to resend my statistics.

Maia Cherney wrote:
 Dear Bernhard

 I am wondering where I should cut my data off. Here is the statistics 
 from XDS processing.

 Maia




 On 11-03-03 04:29 AM, Roberto Battistutta wrote:
  
 Dear all,
 I got a reviewer comment that indicates the need to refine the 
 structures
 
 at an appropriate resolution (I/sigmaI of >3.0), and re-submit the 
 revised coordinate files to the PDB for validation. In the 
 manuscript I present some crystal structures determined by molecular 
 replacement using the same protein in a different space group as 
 search model. Does anyone know the origin or the theoretical basis of 
 this I/sigmaI >3.0 rule for an appropriate resolution?
  
 Thanks,
 Bye,
 Roberto.


 Roberto Battistutta
 Associate Professor
 Department of Chemistry
 University of Padua
 via Marzolo 1, 35131 Padova - ITALY
 tel. +39.049.8275265/67
 fax. +39.049.8275239
 roberto.battistu...@unipd.it
 www.chimica.unipd.it/roberto.battistutta/
 VIMM (Venetian Institute of Molecular Medicine) via Orus 2, 35129 
 Padova - ITALY tel. +39.049.7923236 fax +39.049.7923250 www.vimm.it

 

   




[ccp4bb] [Fwd: Re: [ccp4bb] I/sigmaI of 3.0 rule]

2011-03-03 Thread Maia Cherney



 Original Message 
Subject:Re: [ccp4bb] I/sigmaI of 3.0 rule
Date:   Thu, 03 Mar 2011 10:43:23 -0700
From:   Maia Cherney ch...@ualberta.ca
To: Oganesyan, Vaheh oganesy...@medimmune.com
References: 	2ba9ce2f-c299-4ca9-a36a-99065d1b3...@unipd.it 
4d6faed8.7040...@ualberta.ca 
021001cbd9bc$f0ecc940$d2c65bc0$@gmail.com 
4d6fcab6.3090...@ualberta.ca 4d6fcbff.2010...@ualberta.ca 
73e543de77290c409c9bed6fa4ca34bb0173a...@md1ev002.medimmune.com




Vaheh,

The problem was with Rmerge. As you can see, at I/sigma = 2.84 the Rmerge 
(R-factor) was 143%. I am asking this question because B. Rupp wrote:
"However, there is a simple relation between I/sigI and R-merge 
(provided no other indecency has been done to the data). It simply is 
(BMC) Rm = 0.8/<I/sigI>."

Maybe my data are indecent? This is the whole LP file.

Maia
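
A quick numerical check of that rule of thumb (an illustrative sketch; the
I/sigma and Rmerge pairs are read off the XDS table quoted earlier in the
thread):

    # Compare the rule of thumb Rm ~ 0.8/<I/sigI> with the observed Rmerge
    # for a few shells (values taken from the XDS statistics above).
    shells = [(63.83, 0.030), (21.77, 0.157), (4.19, 0.914), (2.84, 1.434)]

    for i_over_sig, r_obs in shells:
        r_pred = 0.8 / i_over_sig
        print(f"I/sigI={i_over_sig:6.2f}  predicted Rm={r_pred:6.1%}  "
              f"observed Rm={r_obs:6.1%}")
    # The prediction falls well short of observation in every shell - the
    # very discrepancy being asked about (see the multiplicity point below).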





 **
XSCALE (VERSION  December 6, 2007)28-Aug-2009
 **

 Author: Wolfgang Kabsch
 Copy licensed until (unlimited) to
  Canadian Light Source, Saskatoon, Canada.
 No redistribution.


 **
  CONTROL CARDS
 **

 MAXIMUM_NUMBER_OF_PROCESSORS=8
 SPACE_GROUP_NUMBER=180
 UNIT_CELL_CONSTANTS= 150.1 150.1  81.8  90.0  90.0 120.0  
 OUTPUT_FILE=XSCALE.HKL
 FRIEDEL'S_LAW=TRUE
 INPUT_FILE= XDS_ASCII.HKL  XDS_ASCII  
 INCLUDE_RESOLUTION_RANGE= 40  2.25

 THE DATA COLLECTION STATISTICS REPORTED BELOW ASSUMES:
 SPACE_GROUP_NUMBER=  180
 UNIT_CELL_CONSTANTS=   150.10   150.1081.80  90.000  90.000 120.000

 * 12 EQUIVALENT POSITIONS IN SPACE GROUP #180 *

If x',y',z' is an equivalent position to x,y,z, then
x'=x*ML(1)+y*ML( 2)+z*ML( 3)+ML( 4)/12.0
y'=x*ML(5)+y*ML( 6)+z*ML( 7)+ML( 8)/12.0
z'=x*ML(9)+y*ML(10)+z*ML(11)+ML(12)/12.0

    #     1  2  3  4     5  6  7  8     9 10 11 12
    1     1  0  0  0     0  1  0  0     0  0  1  0
    2     0 -1  0  0     1 -1  0  0     0  0  1  8
    3    -1  1  0  0    -1  0  0  0     0  0  1  4
    4    -1  0  0  0     0 -1  0  0     0  0  1  0
    5     0  1  0  0    -1  1  0  0     0  0  1  8
    6     1 -1  0  0     1  0  0  0     0  0  1  4
    7     0  1  0  0     1  0  0  0     0  0 -1  8
    8    -1  0  0  0    -1  1  0  0     0  0 -1  4
    9     1 -1  0  0     0 -1  0  0     0  0 -1  0
   10     0 -1  0  0    -1  0  0  0     0  0 -1  8
   11     1  0  0  0     1 -1  0  0     0  0 -1  4
   12    -1  1  0  0     0  1  0  0     0  0 -1  0
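
(A short sketch, added for illustration, of the mapping the log describes:
the operator tuple below is row #2 of the table above and the fractional
coordinates are arbitrary:)

    # Apply x' = x*ML(1) + y*ML(2) + z*ML(3) + ML(4)/12.0, and likewise
    # for y' and z', to fractional coordinates.
    def apply_op(ml, xyz):
        x, y, z = xyz
        return (x*ml[0] + y*ml[1] + z*ml[2] + ml[3]/12.0,
                x*ml[4] + y*ml[5] + z*ml[6] + ml[7]/12.0,
                x*ml[8] + y*ml[9] + z*ml[10] + ml[11]/12.0)

    op2 = (0, -1, 0, 0,  1, -1, 0, 0,  0, 0, 1, 8)  # row #2 above
    print(apply_op(op2, (0.1, 0.2, 0.3)))  # -> (-0.2, -0.1, 0.9666...)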
 

 ALL DATA SETS WILL BE SCALED TO XDS_ASCII.HKL  
   


 **
READING INPUT REFLECTION DATA FILES
 **


 DATAMEAN   REFLECTIONSINPUT FILE NAME
 SET# INTENSITY  ACCEPTED REJECTED
   1  0.6203E+03   557303  0  XDS_ASCII.HKL 


 **
   CORRECTION FACTORS AS FUNCTION OF IMAGE NUMBER  RESOLUTION
 **

 RECIPROCAL CORRECTION FACTORS FOR INPUT DATA SETS MERGED TO
 OUTPUT FILE: XSCALE.HKL

 THE CALCULATIONS ASSUME FRIEDEL'S_LAW= TRUE
 TOTAL NUMBER OF CORRECTION FACTORS DEFINED  720
 DEGREES OF FREEDOM OF CHI^2 FIT140494.9
 CHI^2-VALUE OF FIT OF CORRECTION FACTORS  1.037
 NUMBER OF CYCLES CARRIED OUT  3

 CORRECTION FACTORS for visual inspection with VIEW DECAY_001.pck   
 INPUT_FILE=XDS_ASCII.HKL 
 XMIN= 0.1 XMAX=   179.9 NXBIN=   36
 YMIN= 0.00257 YMAX= 0.19752 NYBIN=   20
 NUMBER OF REFLECTIONS USED FOR DETERMINING CORRECTION FACTORS 238321


 **
  CORRECTION FACTORS AS FUNCTION OF X (fast)  Y(slow) IN THE DETECTOR PLANE
 **

 RECIPROCAL CORRECTION FACTORS FOR INPUT DATA SETS MERGED TO
 OUTPUT FILE: XSCALE.HKL

 THE CALCULATIONS ASSUME FRIEDEL'S_LAW= TRUE
 TOTAL NUMBER OF CORRECTION FACTORS DEFINED 4760
 DEGREES OF FREEDOM OF CHI^2 FIT186486.8
 CHI^2-VALUE OF FIT OF CORRECTION 

Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Roberto Battistutta
just to clarify that, at least in my case, my impression is that the editor was 
fair; I was referring only to the comment of one reviewer.

Roberto


Roberto Battistutta
Associate Professor
Department of Chemistry
University of Padua
via Marzolo 1, 35131 Padova - ITALY
tel. +39.049.8275265/67
fax. +39.049.8275239
roberto.battistu...@unipd.it
www.chimica.unipd.it/roberto.battistutta/
VIMM (Venetian Institute of Molecular Medicine)
via Orus 2, 35129 Padova - ITALY
tel. +39.049.7923236
fax +39.049.7923250
www.vimm.it

Il giorno 03/mar/2011, alle ore 18.16, Van Den Berg, Bert ha scritto:

 We should compile this discussion and send it as compulsive reading to 
 journal editors...;-)
 
 Bert


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Maia Cherney
I see, there is no consensus about my data. Some people say 2.4 A, others 
say use all of it. Well, I chose 2.3 A. My rule was to be a little bit below 
Rmerge 100%; at 2.3 A Rmerge was 98.7%.
Actually, I have published my paper in JMB. Yes, reviewers did not like 
that and even made me give Rrim and Rpim etc.


Maia



Bernhard Rupp (Hofkristallrat a.D.) wrote:

First of all I would ask an XDS expert for that because I don't know exactly
what stats the XDS program reports (shame on me, ok) nor what the quality of
your error model is, or what you want to use the data for (I guess
refinement - see Eleanor's response for that, and use all data).

There is one point I'd like to make re cutoff: If one gets greedy and
collects too much noise in high resolution shells (like way below I/sigI =
0.8 or so) the scaling/integration may suffer from an overabundance of
nonsense data, and here I believe it makes sense to select a higher cutoff
(like what exactly?) and reprocess the data. Maybe one of our data
collection specialists should comment on that.

BR


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Maksymilian Chruszcz
Dear All,

Relatively recent statistics on I/sigmaI and Rmerge in PDB deposits are
presented in the two following publications:

1.Benefits of structural genomics for drug discovery research.
Grabowski M, Chruszcz M, Zimmerman MD, Kirillova O, Minor W.
Infect Disord Drug Targets. 2009 Nov;9(5):459-74.
PMID: 19594422

2. X-ray diffraction experiment-the last experiment in the structure
elucidation process.
Chruszcz M, Borek D, Domagalski M, Otwinowski Z, Minor W.
Adv Protein Chem Struct Biol. 2009;77:23-40
PMID: 20663480

Best regards,

Maks

attachment: I_over_sigma_I.png

Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Tim Gruene
Hello Maia,

Rmerge is obsolete, so the reviewers had a good point to make you publish Rmeas
instead. Rmeas should replace Rmerge in my opinion.

The data statistics you sent show a multiplicity of about 20! Did you check 
your
data for radiation damage? That might explain why your Rmeas is so utterly high
while your I/sigI is still above 2. (You should not cut your data but include
more!)

What do the statistics look like if you process just about enough frames so that
you get a reasonable multiplicity, 3-4, say?

Cheers, Tim

On Thu, Mar 03, 2011 at 10:57:37AM -0700, Maia Cherney wrote:
 I see, there is no consensus about my data. Some people say 2.4 A,
 others say use all of it. Well, I chose 2.3 A. My rule was to be a little bit
 below Rmerge 100%; at 2.3 A Rmerge was 98.7%.
 Actually, I have published my paper in JMB. Yes, reviewers did not
 like that and even made me give Rrim and Rpim etc.
 
 Maia
 
 
 

-- 
--
Tim Gruene
Institut fuer anorganische Chemie
Tammannstr. 4
D-37077 Goettingen

phone: +49 (0)551 39 22149

GPG Key ID = A46BEE1A



signature.asc
Description: Digital signature


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Bernhard Rupp (Hofkristallrat a.D.)
 The data statistics you sent show a multiplicity of about 20! Did you
check your data for radiation damage? That might explain why your Rmeas is
so utterly high while your I/sigI is still above 2 (you should not cut your
data but include more!)

So then I got that wrong - with that *high* a redundancy, the preceding term
becomes ~1 and linear Rmerge and Rmeas asymptotically become the same?

BR 



Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Bernhard Rupp (Hofkristallrat a.D.)
 Rmeas is always higher than Rmerge, so if my Rmerge is high I don't like
Rmeas either.

But that makes perfect sense now per Tim: for small N (low redundancy) the
linear Rmerge always gives lower values, and it rises with redundancy to
approach Rmeas/Rrim at high redundancy. 
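
(The multiplicity prefactors in the usual definitions make this explicit:
per reflection measured n times, Rmeas carries a factor sqrt(n/(n-1))
relative to the linear Rmerge sum and Rpim a factor sqrt(1/(n-1)) - a small
illustrative sketch:)

    # How the Rmeas and Rpim prefactors behave as redundancy n grows.
    from math import sqrt

    for n in (2, 4, 10, 20):
        print(f"n={n:2d}  Rmeas factor={sqrt(n/(n-1)):.3f}  "
              f"Rpim factor={sqrt(1/(n-1)):.3f}")
    # n=20 gives 1.026 and 0.229: Rmeas ~ Rmerge at high redundancy (the
    # point above), while Rpim keeps dropping (the suggestion further down).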

 I like the idea just to look at the I/sigI and include more data.

Lucky me to suggest to use all your present data for refinement... ;-)

BR



[ccp4bb] setting up additive screen

2011-03-03 Thread m zhang




Dear all,

I am trying to optimize my crystal with additives. Since the yield of my 
protein purification is very limited, I am wondering what is the most efficient 
way to set up drops with additives, to save my protein and not waste additives? 
I am setting up 1:1 drops with 0.2 ul of additive, but I feel 0.2 ul is not 
very accurate, even if I use a P2. Would you share your ways of setting up 
drops with additives? If I want to screen some additives, which ones would you 
suggest trying first, especially among the 96 additive conditions from 
Hampton? By the way, just wondering, what kind of P2 pipettor works better? Any 
input is greatly appreciated!

Thank you,

Min
  

[ccp4bb] AKTA Explorer and Prime need new homes [off-topic]

2011-03-03 Thread Erin Curry
Hello CCP4BB,
We have a beautiful GE AKTA Explorer and an AKTA Prime available.
Please contact me directly at ress...@gmail.com or 510-344-6633 for
more info.
Thanks,
Erin


Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Bernhard Rupp (Hofkristallrat a.D.)
 I don't like Rmeas either.

Given the Angst caused by actually useful redundancy, would it not be more
reasonable, then, to report Rpim, which decreases with redundancy? Maybe Rpim
in an additional column would help to reduce the Angst?

BR  



Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Bart Hazes
Higher redundancy lowers Rpim because it increases precision. However, 
it need not increase accuracy if the observations are not drawn from the 
true distribution. If pathologic behaviour of R-factor statistics is 
due to radiation damage, as I believe is often the case, we are 
combining observations that are no longer equivalent. If you used long 
exposures per image and collected just enough data for a complete data 
set, you are out of luck. If you used shorter exposures and opted for a 
high-redundancy set, then you have the option to toss out the last N 
images to get rid of the most damaged data, or you can try to compensate 
for the damage with zerodose (or whatever the program's name was; I 
think it is from Wolfgang Kabsch).


Rejecting data is never desirable but I think it may be better than 
merging non-equivalent data that can't be properly modeled by a single 
structure.


Bart

On 11-03-03 12:34 PM, Bernhard Rupp (Hofkristallrat a.D.) wrote:

I don't like Rmeas either.

Given the Angst caused by actually useful redundancy, would it not be more
reasonable then to report Rpim which decreases with redundancy? Maybe Rpim
in an additional column would help to reduce the Angst?

BR



--



Bart Hazes (Associate Professor)
Dept. of Medical Microbiology & Immunology
University of Alberta
1-15 Medical Sciences Building
Edmonton, Alberta
Canada, T6G 2H7
phone:  1-780-492-0042
fax:1-780-492-7521




Re: [ccp4bb] setting up additive screen

2011-03-03 Thread Yibin Lin
Use a robot. You only need 0.1 ul x 96 = 9.6 ul of protein solution.







[ccp4bb] mosflm gain

2011-03-03 Thread Bryan Lepore
wondering if mosflm can automatically estimate the gain.

i.e. i gather it is still estimated the usual way.

-Bryan


Re: [ccp4bb] Processing Laue data

2011-03-03 Thread marius . schmidt
Based on roughly 1500 complete Laue data sets
containing more than 45000 Laue patterns, 
I can say the following:
How to collect Laue data?

1.) put crystal on (capillary or cryo is fine)
2.) switch on X-rays (or Neutrons)
 for time-resolved studies select one or more pulses
 else forget pulses
2a) for time-resolved studies scan the crystal edge at reduced
X-ray flux to skim only the surface that is hit by the laser;
after the edge scan, expose to full X-ray pulses,
else forget about the edge scan
3.) switch off X-rays (or Neutrons)
4.) read out pattern
5.) set crystal to another orientation
delta phi depends on the bandwidth
(approx. formula available in Ren et al., 1999);
it is usually 2-3 deg with 10% bandwidth,
   e.g. 0.1 A bandwidth at 1 A mean wavelength
   as provided by undulators
6.) goto step 2.) and repeat until reciprocal
space covered

7.) reduce data with a program. in my perception,
the BEST and MOST USER FRIENDLY for X-ray Laue is:
Precognition/Epinorm (RenzResearch)
 >20 data sets per day can be achieved!!!
(I once achieved 56 data sets in 2 days.)
A faster computer or more processors
increases speed.

8.) based on an idea by Anfinrud/Schotte, step 5
can be made much more complicated by randomly filling
gaps. Very nice feature, since it fills reciprocal
space randomly, and if the crystal dies you have at
least random (but not complete) coverage of reciprocal
space.
You may also translate the crystal a bit to expose
a fresh pristine crystal volume.

9.) If it works (mosaicity can still work against you),
enjoy data that are as good as monochromatic, and 
they will give perfect maps.

10.) refine model with CNS or refmac or any other
 refinement program.


Best
Marius





 To all Laue experts up here:
 How is Laue data collected?
 
 Thanks in advance to all
 PSP
 
 On Fri, Jan 28, 2011 at 3:43 AM, REX PALMER
 rex.pal...@btinternet.comwrote:
 
   What programs are available for processing Laue data to produce an
 intensity data set?
 Are explanatory notes or publications available?

 Rex Palmer
 Birkbeck College

 
 
 -- 
 Pius S Padayatti

Dr.habil. Marius Schmidt
Asst. Professor
University of Wisconsin-Milwaukee
Department of Physics Room 454
1900 E. Kenwood Blvd.
Milwaukee, WI 53211

phone: +1-414-229-4338
email: m-schm...@uwm.edu
http://users.physik.tu-muenchen.de/marius/


Re: [ccp4bb] Processing Laue data

2011-03-03 Thread marius . schmidt
Dear John,
of course you are right, apologies for my
little exaggeration. 

Warmest regards
Marius

 Dear Marius,
 To these two centres to which you refer we can add:-
 Diamond Light Source; contact person Prof David Allan (small molecule
 Laue X-ray crystallography);
 Institut Laue Langevin ; contact person Dr Matthew Blakeley (neutron
 Laue protein crystallography);
 Los Alamos Neutron Source; contact person Dr Paul Langan (neutron
 time-of-flight Laue protein crystallography).
 
 Also, just a historical note; the SRS wiggler 9 Laue effort I
 deliberately recentred at ESRF ID09 when Michael Wulff joined ESRF
 and
 set about, with SAC and community approval, building ID09.
 
 Yours sincerely,
 John
 
 On Wed, Mar 2, 2011 at 12:14 AM, Marius Schmidt
 marius.schm...@ph.tum.de wrote:
 there is a small but brave community that actually
 attempted to collect Laue data on proteins with
 modern synchrotron sources. They are all centered
 around Keith Moffat in Chicago and Michael Wulff
 in Grenoble. Maybe you contact these people:
 D. Bourgeois at the ESRF, V. Srajer or Z. Ren at the
 APS.

 Best
 Marius

 What programs are available for processing Laue data to produce an
 intensity data set?
 Are explanatory notes or publications available?

 Rex Palmer
 Birkbeck College
 Dr.habil. Marius Schmidt
 Asst. Professor
 University of Wisconsin-Milwaukee
 Department of Physics Room 454
 1900 E. Kenwood Blvd.
 Milwaukee, WI 53211

 phone: +1-414-229-4338
 email: m-schm...@uwm.edu
 http://users.physik.tu-muenchen.de/marius/

 
 
 -- 
 Professor John R Helliwell DSc

Dr.habil. Marius Schmidt
Asst. Professor
University of Wisconsin-Milwaukee
Department of Physics Room 454
1900 E. Kenwood Blvd.
Milwaukee, WI 53211

phone: +1-414-229-4338
email: m-schm...@uwm.edu
http://users.physik.tu-muenchen.de/marius/


Re: [ccp4bb] mosflm gain

2011-03-03 Thread David Waterman
Usually Mosflm will use a default value for the gain that depends on the
type of detector used. This value is not realistic for CCD detectors; that
is, it is not really equal to the ratio of ADUs to incident X-ray photons.
However, it satisfies typical images under the assumptions of pixel
independence and Poisson statistics, which are not true either. Inasmuch
as the gain is just a scale factor in the data, it doesn't really matter
that it isn't physically meaningful in the way you might expect from its
name. However, the procedure of calculating the gain from the variance-to-mean
ratio of a background region of the image, which is the only simple
automatic approach available if all you have is an image, should be avoided
if you are looking for the gain in real units.
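
For reference, the simple variance-to-mean estimate described above looks
roughly like this (a sketch, assuming `background` is an array of pixel
values in ADUs from a spot-free region of the image):

    # Variance-to-mean gain estimate from a background region, valid only
    # under the pixel-independence and Poisson assumptions discussed above.
    import numpy as np

    def estimate_gain(background):
        bg = np.asarray(background, dtype=float)
        return bg.var() / bg.mean()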

I realise that didn't answer the question, but I thought it might be worth
pointing out.


-- David


On 3 March 2011 20:34, Bryan Lepore bryanlep...@gmail.com wrote:

 wondering if mosflm can automatically estimate the gain.

 i.e. i gather it is still estimated the usual way.

 -Bryan



Re: [ccp4bb] I/sigmaI of 3.0 rule

2011-03-03 Thread Ingo P. Korndoerfer
not sure whether this option has been mentioned before ...

i think what we really would like to do is decide by the quality of the
density. i see that this is difficult.

so, short of that ... how about the figure of merit in refinement ?

wouldn't the fom reflect how useful our data really are ?

ingo


On 03/03/2011 12:29, Roberto Battistutta wrote:
 Dear all,
 I got a reviewer comment that indicates the need to refine the structures at 
 an appropriate resolution (I/sigmaI of >3.0), and re-submit the revised 
 coordinate files to the PDB for validation. In the manuscript I present 
 some crystal structures determined by molecular replacement using the same 
 protein in a different space group as search model. Does anyone know the 
 origin or the theoretical basis of this I/sigmaI >3.0 rule for an 
 appropriate resolution?
 Thanks,
 Bye,
 Roberto.


 Roberto Battistutta
 Associate Professor
 Department of Chemistry
 University of Padua
 via Marzolo 1, 35131 Padova - ITALY
 tel. +39.049.8275265/67
 fax. +39.049.8275239
 roberto.battistu...@unipd.it
 www.chimica.unipd.it/roberto.battistutta/
 VIMM (Venetian Institute of Molecular Medicine)
 via Orus 2, 35129 Padova - ITALY
 tel. +39.049.7923236
 fax +39.049.7923250
 www.vimm.it