[ccp4bb] I222 - P22121 space-group ambiguity

2014-10-13 Thread Florian Schmitzberger
Hi everybody,

I collected a number of X-ray data sets from crystals originating from the same 
crystallization drop. I solved the initial structure in space group P22121 by MR 
with Phaser, locating two molecules (data to ~2.1 Angstrom); refined R/Rfree: 
0.213/0.244.

Processing of some of the other data sets with XDS/Aimless is consistent with 
space group I222 (resolution ~2.6 Angstrom). I can locate one molecule. The 
unit-cell dimensions (in Angstrom) for I222 and for the initial P22121 space group, 
for two of the data sets, are:
I222: a=87.8, b=101.18, c=123.63; P22121: a=93.34, b=105.47, c=122.98

I superposed the molecule in I222 onto one of the two located for the initially 
solved P22121 structure; the orientation of the NCS-related molecule in P22121 
differs from the crystallographic-symmetry-related one in I222. Trying to solve 
this P22121 data set in I222 with MR does not result in high Z scores, and the 
maps do not look good.

Some of the data sets that process in I222 to ~3 Angstrom I can also solve in 
P22121, locating two molecules (the differences may not be that clear in this 
case, since the resolution is lower).

Some other data sets process in P22121 with Aimless, but with a substantial 
off-origin Patterson peak, indicating translational NCS. For these, Phaser 
positions two molecules related by the translational NCS; these two molecules are 
crystallographic-symmetry related in the original P22121 data set. I can also 
solve these data sets in I222, with the overall Z score higher than for the 
P22121 data.

I am uncertain what the 'true' space group for some of my data sets is. Could 
it be that for data that process in P22121 but can be solved in I222, the 
reflections that would distinguish the two settings (those violating the 
body-centring condition h+k+l = 2n) were not collected? Alternatively, perhaps 
what I am seeing is a (gradual) transition of the crystal lattice between P22121 
and I222 (or vice versa), caused by variation in crystal handling/cooling or by 
exposure to X-rays.
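
For reference, a minimal sketch of how the centring condition could be checked on 
a merged P22121 data set (assuming an MTZ with intensity columns labelled 'I' and 
'SIGI'; the file name and column labels are placeholders, and gemmi/numpy are used 
purely for convenience):

# Compare <I/sigI> for reflections with h+k+l odd vs. even.
# If the odd set is essentially at noise level, the lattice is (pseudo-)I-centred.
import numpy as np
import gemmi

mtz = gemmi.read_mtz_file('p22121_merged.mtz')   # placeholder file name
hkl = mtz.make_miller_array()                    # N x 3 array of h, k, l
i_obs = mtz.column_with_label('I').array
sigma = mtz.column_with_label('SIGI').array

odd = (hkl.sum(axis=1) % 2).astype(bool)         # h+k+l odd violates I-centring
ok = sigma > 0
for name, sel in (('h+k+l even', ~odd & ok), ('h+k+l odd ', odd & ok)):
    print(f'{name}: n = {sel.sum():6d}  <I/sigI> = {(i_obs[sel] / sigma[sel]).mean():.2f}')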

It’s relevant to me because, in space group P22121, a region of the molecule 
that is of biological interest makes NCS-related crystal contacts, and these 
same contacts are crystallographic-symmetry related in I222.

Has anybody observed similar cases? I would appreciate comments.

Cheers,

Florian




Re: [ccp4bb] project and literature organization software (laboratory information management software)

2014-04-29 Thread Florian Schmitzberger
Hi Tobias,

There is Quartzy, which is free.

https://www.quartzy.com

I am not sure it covers all of your desired functionalities.

Best regards,

Florian

On Apr 29, 2014, at 7:21 AM, Tobias Beck tobiasb...@gmail.com wrote:

 Dear all,
 
 I am looking for a software solution to organize many pieces of information:
 
 1.) Results from (bio)chemical experiments, such as spectral data, pictures.
 
 2.) Project ideas, milestones, etc.
 
 3.) Literature, including tags, short comments, etc. 
 
 For example, for a certain project I would like to collect information about 
 experiments conducted, then link this to literature/literature experiments 
 and to project outlines. All this should be accessible for multiple users on 
 different OS. 
 
 I have briefly looked into PiMS (too crystallography-oriented), Contor 
 ELN (only Safari on Mac?), Labguru (nice, but not too flexible and mostly 
 for biosciences) and Confluence (nice wiki, but so far no real literature 
 plugin).
 
 I know that this sounds maybe a little bit like something called in German a 
 'eierlegende Wollmilchsau' 
 http://en.wiktionary.org/wiki/eierlegende_Wollmilchsau 
 
 But I would be happy to hear about what software people (and labs) have 
 tried, liked/disliked and 
 ideally the reasons. 
 
 (I am aware that there was a similar query 
 https://www.mail-archive.com/ccp4bb@jiscmail.ac.uk/msg24657.html, but this 
 was more than 2 years ago)
 
 Thanks a lot!
 
 Best wishes, Tobias.
 
 -- 
 ___
 
 Dr. Tobias Beck
 ETH Zurich
 Laboratory of Organic Chemistry
 Vladimir-Prelog-Weg 3, HCI F 322
 8093 Zurich, Switzerland
 phone:  +41 44 632 68 65
 fax:+41 44 632 14 86
 web:  http://www.protein.ethz.ch/people/tobias
 ___
 

---
Florian Schmitzberger, PhD
Biological Chemistry and Molecular Pharmacology
Harvard Medical School
250 Longwood Avenue, Seeley G. Mudd 127
Boston, MA 02115, USA
Tel: 001 617 432 5601



[ccp4bb] Off-topic: 96-well plate PCR/plasmid purification

2012-06-29 Thread Florian Schmitzberger

Dear All,

I am looking for a bit of advice, and am interested to hear about 
experiences people have had with various commercially available 96-well 
plate-based PCR, plasmid, and protein purification appliances and 
plates. Forgive me for comparing commercial vendors.


We have a 96-well plate-compatible vacuum manifold (Qiavac). However, 
the plasmid-purification kits from this vendor seem comparatively 
pricey to me, so I am looking for a more cost-efficient system. How 
compatible are 96-well PCR purification, plasmid-binding and 
filter plates from different vendors with this type of vacuum 
manifold? From what I can see, it becomes difficult to avoid 
cross-contamination between wells when using e.g. Whatman-type 96-well 
microplates with the Qiavac, because the distance between the drip 
director of the DNA-binding/filter plate and a receiver plate (e.g. a 
PCR plate) in the manifold is relatively long, and the alignment isn't 
great (drops end up on the rim). The alternative would be 
centrifugation directly on top of a receiver plate, but I am concerned 
about cross-contamination.


Which types of membranes/filter plates have people been most pleased with 
for clearing (bacterial) lysates, avoiding clogging with lysates from 
~5 ml bacterial cultures; and which DNA/plasmid-binding plates have the 
highest binding capacity and give the highest purity? Again, 
cost-efficient vendors would be preferred.


Regards,

Florian

















Re: [ccp4bb] The effect of His-tag location on crystallization

2012-06-27 Thread Florian Schmitzberger
Human leukotriene C4 synthase (PDB accession code: 2UUI) is another  
example, illustrating how an N-terminal polyhistidine-tag, in  
conjunction with metals, presumably facilitated crystallization.


On Jun 27, 2012, at 12:04 PM, Brad Bennett wrote:

I think it was an N-terminal RGS-type His tag in 3O8Y (human 
lipoxygenase) that mediated crystal contacts with a symmetry-related 
molecule. As I recall, this tag formed a β-strand that made a 
nice interface with a native β-strand of the symmetry-related 
molecule. Pretty cool...


-Brad

On Wed, Jun 27, 2012 at 11:00 AM, Phoebe Rice pr...@uchicago.edu  
wrote:
With Flp recombinase - DNA complexes, a C-terminal His tag triggered  
a different (but sadly not better) crystal form, and the His side  
chains packed against the bases at the end of a neighboring DNA  
duplex.


=
Phoebe A. Rice
Dept. of Biochemistry & Molecular Biology
The University of Chicago
phone 773 834 1723
http://bmb.bsd.uchicago.edu/Faculty_and_Research/01_Faculty/01_Faculty_Alphabetically.php?faculty_id=123
http://www.rsc.org/shop/books/2008/9780854042722.asp


 Original message 
Date: Wed, 27 Jun 2012 10:14:58 -0400
From: CCP4 bulletin board CCP4BB@JISCMAIL.AC.UK (on behalf of R.  
M. Garavito rmgarav...@gmail.com)
Subject: Re: [ccp4bb] The effect of His-tag location on  
crystallization

To: CCP4BB@JISCMAIL.AC.UK

Most of the comments you will get will be anecdotal, in that people will report
the successful results and do not take the time or effort to characterize the
less successful results. This often occurs because the tagged portion of the
protein is most often disordered, even in the best crystals. Thus, other than
saying "tagging on this end works, but tagging on that end doesn't", there is
little more you can say. Each case will be different, and it is almost
impossible to arrive at any generalized conclusion.

We prefer C-terminally tagged proteins for a number of reasons, but if an
N-terminally tagged protein crystallizes well, so be it. Of the dozens of N-
and C-tagged protein structures we have solved in my lab and with
collaborators, I have only seen one case of an ordered His-tag: the His
residues had coordinated Cd ions, which proved essential for getting good
crystals. However, beyond that there was not much more to say.

For your protein and the resulting crystals, an N-terminally tagged protein
crystallized well. Whether you can draw any more conclusions from these
results depends on characterizing crystals of both N- and C-tagged proteins.
Just assuming that the C-tagged protein is trying to crystallize in the same
or a related crystal form as the N-tagged protein is an unwarranted assumption
without experimental evidence to back it up. That is why most groups just run
with the winner.

Cheers,
Michael

R. Michael Garavito, Ph.D.
Professor of Biochemistry & Molecular Biology
603 Wilson Rd., Rm. 513
Michigan State University
East Lansing, MI 48824-1319
Office: (517) 355-9724   Lab: (517) 353-9125
FAX: (517) 353-9334
Email: rmgarav...@gmail.com

   On Jun 26, 2012, at 9:06 PM, weliu wrote:

 Dear all,

  We crystallized a protein and found that crystal quality greatly
  depended on the location of the His-tag. When a His-tag was added at
  the C-terminus, only crystalline precipitate or spherical
  quasi-crystals grew. However, when the His-tag was moved to the
  N-terminus, single crystals grew under a number of conditions, and the
  best one diffracted to 1.7 Angstrom after optimization. I was wondering
  if there are published reports describing similar cases.

 Thank you in advance

 Wei Liu

















[ccp4bb] Off-topic: PDB deposition of multiple structure factor files

2012-04-27 Thread Florian Schmitzberger

Dear All,

With my most recent PDBe deposition, in addition to the native data, I 
had intended to deposit the anomalous data used for structure 
determination, and to make them available for download. This turned out to 
be less straightforward than I had anticipated, because the current 
PDB convention is to allow only a single structure-factor file of 
experimental data (usually the native dataset) to be available for download 
from the PDB. In my case, the anomalous data were concatenated with 
the native data into a single cif file (this worked and made sense, 
because the unit-cell dimensions of the two datasets are virtually 
identical).


I imagine it would be beneficial to be able to make available more 
than a single structure-factor file in the PDB, including files derived 
from experimental phasing, along with the final coordinates, 
without concatenating the data into a single file (which may confuse 
users when downloaded). Is this something the PDB is 
already working to implement in the near future (perhaps via the 
upcoming PDBx format)?


Best regards,

Florian














Re: [ccp4bb] Off-topic: PDB deposition of multiple structure factor files

2012-04-27 Thread Florian Schmitzberger

Dear Mark,

This is interesting. I had also submitted my data via the PDBe 
(European portal). While they allow deposition of multiple datasets, 
only a single file can apparently be made available for download from 
the site. In contrast to your case, though, for my deposition the second 
deposited dataset is not explicitly listed.


Cheers,

Florian

On Apr 27, 2012, at 2:35 PM, Mark J van Raaij wrote:


again, it looks like this is particular to the US portal.
We submit via the European www.pdbe.org and can submit multiple  
datasets.

See 2XGF for an example.
Note: I think from www.rcsb.org only one file can be downloaded, but www.pdbe.org 
 clearly shows both.
Although you are in the US, you can use the pdbe deposition tool  
AUTODEP - or the Japanese one, if you like.


Mark J van Raaij
Laboratorio M-4
Dpto de Estructura de Macromoleculas
Centro Nacional de Biotecnologia - CSIC
c/Darwin 3
E-28049 Madrid, Spain
tel. (+34) 91 585 4616
http://www.cnb.csic.es/~mjvanraaij



On 27 Apr 2012, at 20:23, Florian Schmitzberger wrote:


Dear All,

With my most recent PDBe deposition, in addition to the native  
data, I had intended to deposit the anomalous data, used for  
structure determination, and make it available for download. This  
turned out to be less straightforward than I had anticipated,  
because the current PDB convention is to only allow a single  
structure factor file for experimental data (usually the native  
dataset), available for download from the PDB. In my case, the  
anomalous data were concatenated with the native data into a single  
cif file (this worked and made sense, because the unit-cell dimensions 
of the two datasets are virtually identical).


I imagine it would be beneficial to be able to make available more  
than a single structure factor file, including the ones derived  
from experimental phasing, in the PDB, along with the final  
coordinates, without concatenating the data into a single file  
(which may lead to confusion to users when downloaded). Is this  
anything the PDB is already working to implement in the near future  
(perhaps via the coming PDBx format)?


Best regards,

Florian














---
Florian Schmitzberger, PhD
Biological Chemistry and Molecular Pharmacology
Harvard Medical School
250 Longwood Avenue, Seeley G. Mudd 123
Boston, MA 02115, US
Tel: 001 617 432 5603















Re: [ccp4bb] Substitution to glycerol during crystallogenesis

2012-04-03 Thread Florian Schmitzberger

Dear Toby,

I don't think there is a basic problem with using glycerol in 
crystallization. Glycerol will affect the vapour pressure (if it is 
not present in the well/precipitant solution), and 10 % glycerol is a 
~1.3 M concentration. During equilibration the drops may therefore increase 
in volume, decreasing the protein concentration. Thus, when using 
glycerol I think it is generally beneficial to start with a high 
protein concentration. Perhaps you can concentrate your protein 
sample further.
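
(The ~1.3 M figure is just a back-of-the-envelope estimate; a quick check, 
assuming the usual density of pure glycerol:)

# Quick check of the '10 % glycerol is ~1.3 M' estimate
mw = 92.09                        # g/mol, glycerol
rho = 1.26                        # g/mL, pure glycerol (approximate)
print(0.10 * rho * 1000 / mw)     # 10 % v/v            -> ~1.37 M
print(100.0 / mw)                 # 10 % w/v (100 g/L)  -> ~1.09 M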


I have on several occasions observed immediate precipitation upon 
mixing protein solution (containing glycerol) with precipitant 
solution; the drops then cleared up after a short period of time (and 
crystals eventually formed). In this case, the crystallization 
experiment starts in the supersaturated zone and moves towards an 
undersaturated concentration, traversing the (metastable) zone where 
nucleation and crystallization can happen (rather than the other way 
around, which is the more traditional trajectory in crystallization 
by vapour diffusion).


Enrico Stura published a recent article describing the effect of 
glycerol on crystallization: Vera, L., Czarny, B., Georgiadis, D., 
Dive, V., Stura, E.A. (2011) Practical Use of Glycerol in Protein 
Crystallization. Cryst. Growth & Des. 11:2755–2762.


You could replace glycerol with ethylene glycol or a low-molecular-weight 
PEG (e.g. PEG 400), which may also have a stabilizing effect on 
your complex.


Regards,

Florian

On Apr 3, 2012, at 7:49 AM, Toby Longwood wrote:


Dear all,

My question is related to sample preparation.

I’m working with a complex that can be stabilized with glycerol (at 
least 10%) during purification. The use of detergents does not help. 
After purification, the sample is homogeneous (by EM) and can be 
concentrated (3-4 mg/mL). I have already set up many drops, varying 
several conditions (pH, salt, ...), but nothing conclusive has appeared.


I know that crystallization in the presence of glycerol can work (Sousa, 
Acta Cryst. (1995), ...); however, because of the appearance of the drops 
(precipitates that seem close to the nucleation phase), I suspect 
that glycerol may be one of the limiting factors of the protocol.


Has anybody else already been confronted with the same problem? Does 
anyone know of an alternative additive to glycerol?


Thanks in advance for suggestions/help

With best wishes



Toby


















[ccp4bb] xds - problem with reference profile

2012-03-08 Thread Florian Schmitzberger

Dear All,

I am getting a warning message in XDS that I have not seen before, when 
trying to integrate a low-resolution (~7 A) dataset.


!!! WARNING !!! REFERENCE PROFILE #  1 IS EMPTY.
 THE AVERAGE PROFILE IN THIS BATCH IS USED INSTEAD.

and so forth for the other reference profiles.

Indexing works fine in XDS, with the likely point group C2. Scaling 
the XDS-integrated data with Scala fails, however; apparently most of the 
reflections are rejected.


What are probable causes of the above warning message?

Cheers,

Florian









Re: [ccp4bb] Reasoning for Rmeas or Rpim as Cutoff

2012-01-30 Thread Florian Schmitzberger
On Jan 30, 2012, at 10:28 AM, Jacob Keller wrote:

 I'm intrigued:  how come this apparently excellent idea has not become
 standard best practice in the 14 years since it was published?
 
 It would seem because too few people know about it, and it is not
 implemented in any software in the usual pipeline. Maybe it could be?

phenix.model_vs_data calculates a sigmaA vs. resolution plot (under comprehensive 
validation in the GUI). Pavel would probably have replied by now, but I don't 
think the discussion has been cross-posted to the Phenix BB.
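
A rough sketch of the cross-validation idea discussed below (correlation between 
Fo and Fc in resolution shells, computed from an MTZ that already contains the 
observed amplitudes and Fcalcs from a model refined only against lower-resolution 
data; the column labels 'FP' and 'FC' and the file name are placeholders, and 
gemmi/numpy are used just for convenience):

import numpy as np
import gemmi

mtz = gemmi.read_mtz_file('model_vs_data.mtz')   # placeholder file name
d = 1.0 / np.sqrt(mtz.make_1_d2_array())         # resolution (A) per reflection
fo = mtz.column_with_label('FP').array           # observed amplitudes
fc = mtz.column_with_label('FC').array           # calculated amplitudes
ok = np.isfinite(fo) & np.isfinite(fc)

edges = np.quantile(d[ok], np.linspace(0.0, 1.0, 11))   # 10 equal-population shells
for lo, hi in zip(edges[:-1], edges[1:]):                # from high to low resolution
    sel = ok & (d >= lo) & (d <= hi)
    if sel.sum() > 10:
        cc = np.corrcoef(fo[sel], fc[sel])[0, 1]
        print(f'{hi:5.2f} - {lo:5.2f} A   n = {sel.sum():6d}   CC(Fo,Fc) = {cc:.3f}')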

Cheers,

Florian




 
  Perhaps the way to do it would be always to integrate to
  ridiculously high resolution, give that to Refmac, and, starting from
  lower resolution, to iterate to higher resolution according to the most
  recent sigmaA calculation, and cut off according to some reasonable
  sigmaA value?
 
 JPK
 
 
 
 
 phx
 
 
 
 On 30/01/2012 09:40, Randy Read wrote:
 
 Hi,
 
 Here are a couple of links on the idea of judging resolution by a type of
 cross-validation with data not used in refinement:
 
 Ling et al, 1998: http://pubs.acs.org/doi/full/10.1021/bi971806n
 Brunger et al,
 2008: http://journals.iucr.org/d/issues/2009/02/00/ba5131/index.html
   (cites earlier relevant papers from Brunger's group)
 
 Best wishes,
 
 Randy Read
 
 On 30 Jan 2012, at 07:09, arka chakraborty wrote:
 
 Hi all,
 
 In the context of the above going discussion can anybody post links for a
 few relevant articles?
 
 Thanks in advance,
 
 ARKO
 
 On Mon, Jan 30, 2012 at 3:05 AM, Randy Read rj...@cam.ac.uk wrote:
 
 Just one thing to add to that very detailed response from Ian.
 
 We've tended to use a slightly different approach to determining a
 sensible resolution cutoff, where we judge whether there's useful
 information in the highest resolution data by whether it agrees with
 calculated structure factors computed from a model that hasn't been refined
 against those data.  We first did this with the complex of the Shiga-like
 toxin B-subunit pentamer with the Gb3 trisaccharide (Ling et al, 1998).
  From memory, the point where the average I/sig(I) drops below 2 was around
 3.3A.  However, we had a good molecular replacement model to solve this
 structure and, after just carrying out rigid-body refinement, we computed a
 SigmaA plot using data to the edge of the detector (somewhere around 2.7A,
 again from memory).  The SigmaA plot dropped off smoothly to 2.8A
 resolution, with values well above zero (indicating significantly better
 than random agreement), then dropped suddenly.  So we chose 2.8A as the
 cutoff.  Because there were four pentamers in the asymmetric unit, we could
 then use 20-fold NCS averaging, which gave a fantastic map.  In this case,
 the averaging certainly helped to pull out something very useful from a very
 weak signal, because the maps weren't nearly as clear at lower resolution.
 
 Since then, a number of other people have applied similar tests.  Notably,
 Axel Brunger has done some careful analysis to show that it can indeed be
 useful to take data beyond the conventional limits.
 
 When you don't have a great MR model, you can do something similar by
 limiting the resolution for the initial refinement and rebuilding, then
 assessing whether there's useful information at higher resolution by using
 the improved model (which hasn't seen the higher resolution data) to compute
 Fcalcs.  By the way, it's not necessary to use a SigmaA plot -- the
 correlation between Fo and Fc probably works just as well.  Note that, when
 the model has been refined against the lower resolution data, you'll expect
 a drop in correlation at the resolution cutoff you used for refinement,
 unless you only use the cross-validation data for the resolution range used
 in refinement.
 
 -
 Randy J. Read
 Department of Haematology, University of Cambridge
  Cambridge Institute for Medical Research    Tel: +44 1223 336500
  Wellcome Trust/MRC Building                 Fax: +44 1223 336827
  Hills Road                                  E-mail: rj...@cam.ac.uk
 Cambridge CB2 0XY, U.K.
 www-structmed.cimr.cam.ac.uk
 
 On 29 Jan 2012, at 17:25, Ian Tickle wrote:
 
 Jacob, here's my (personal) take on this:
 
 The data quality metrics that everyone uses clearly fall into 2
 classes: 'consistency' metrics, i.e. Rmerge/meas/pim and CC(1/2) which
 measure how well redundant observations agree, and signal/noise ratio
 metrics, i.e. mean(I/sigma) and completeness, which relate to the
 information content of the data.
 
 IMO the basic problem with all the consistency metrics is that they
 are not measuring the quantity that is relevant to refinement and
 electron density maps, namely the information content of the data, at
 least not in a direct and meaningful way.  This is because there are 2
 contributors to any consistency metric: the systematic errors (e.g.
 differences in illuminated volume and absorption) and the random
 errors (from counting statistics, detector noise etc.).  If the data
 are collected with 

Re: [ccp4bb] detect dsDNA

2011-10-02 Thread Florian Schmitzberger
There is a less toxic chemical than EtBr for staining DNA: SYBR Safe 
DNA stain (a fluorescent dye sold by a certain vendor). Another 
benefit is that you can use blue light, reducing UV/VIS light 
exposure when handling gels.


Florian

On Oct 2, 2011, at 11:49 AM, Edward A. Berry wrote:


Jacob Keller wrote:

I actually looked at an EtBr MSDS a while ago, and was shocked at how
benign it was. I also heard from someone that they used to feed it to
Argentinian cows routinely a few years back...



Wikipedia says it was used as a trypanocide - it's being
discontinued not because of toxicity to beast or man, but because
of insufficient toxicity to trypanosomes - the little buggers
are developing resistance. Of course resistance would develop
earlier if EtBr is mutagenic. Maybe they overexpress
DNA repair enzymes.


[ccp4bb] Off topic: vector map editing and DNA sequence alignment software

2011-09-27 Thread Florian Schmitzberger

Dear All,

What software are people commonly using these days for vector/plasmid 
map editing, making/visualizing vector maps, and aligning (small to 
medium-size) DNA sequencing data? Preferably, it should not be too 
expensive and should be able to write text files readable by other 
programs.


I am familiar with VectorNTI, which is great for vector visualization 
and editing, but I find it somewhat expensive. Sequencher seems good 
for quickly aligning DNA sequences (such as from sequencing runs) with 
templates, but is not free. I have been using ApE for a while for 
alignments, but aligning many sequences is more cumbersome than in 
Sequencher; I have not tested whether Sequencher is good at visualizing 
and editing plasmid maps.


Ideally, I would like to have a single program for both purposes 
(vector editing and DNA sequence comparison). Does something like that 
exist? What are the alternatives to the above programs?


Thank you in advance.

Florian

---
Florian Schmitzberger, PhD
Biological Chemistry and Molecular Pharmacology
Harvard Medical School
250 Longwood Avenue, Seeley G. Mudd 123
Boston, MA 02115, US
Tel: 001 617 432 5603















Re: [ccp4bb] Electostatic surface at various pH

2011-08-19 Thread Florian Schmitzberger

Hi John,

I would probably use PDB2PQR to assign charges and radii to the 
atoms at the different pH values, and then APBS (integrated into PyMOL) 
for visualization:


http://kryptonite.nbcr.net/pdb2pqr/
http://www.poissonboltzmann.org/apbs/
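
A minimal sketch of the PDB2PQR step for several pH values (the flag names are 
from memory and the file names are placeholders; check 'pdb2pqr --help' for your 
installed version):

import subprocess

# Generate PQR files at several pH values with command-line PDB2PQR;
# the resulting .pqr files can then be fed to APBS/PyMOL for the maps.
for ph in (4.0, 7.0, 9.0):
    subprocess.run(['pdb2pqr', '--ff=PARSE', f'--with-ph={ph}',
                    'protein.pdb', f'protein_ph{ph:g}.pqr'],
                   check=True)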

Hope this helps,

Florian

On Aug 19, 2011, at 2:35 PM, john peter wrote:


Hi All,

Apologies for this slightly off-topic question; however, this could very well be
the best bulletin board to seek help.

I need to calculate the electrostatic surface and make surface figures
for a protein at various pH values, say 4.0, 7.0 and 9.0. I'm doing this kind
of work for the first time. Could you suggest user-friendly
software, tutorials and tips?

Sincere thanks and appreciation.

John


Re: [ccp4bb] XDS problem: REMOVE.HKL ignored?

2011-07-25 Thread Florian Schmitzberger

Hi Engin,

I encountered the same issue a couple of months ago. As I understand it, 
the REMOVE.HKL file will only be used if you specify the space group 
and unit cell in the XDS.INP file.
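
A trivial sketch of what I mean (it just checks whether XDS.INP already contains 
explicit values; the keyword names are the standard XDS ones, but double-check 
against your own XDS.INP):

import re, pathlib

# Check that XDS.INP explicitly specifies space group and cell, which
# (as far as I can tell) is required for REMOVE.HKL to take effect in CORRECT.
text = pathlib.Path('XDS.INP').read_text()
for key in ('SPACE_GROUP_NUMBER=', 'UNIT_CELL_CONSTANTS=', 'JOB='):
    match = re.search(rf'^\s*{key}.*$', text, flags=re.MULTILINE)
    print(match.group(0).strip() if match
          else f'{key} missing - add it (e.g. from IDXREF.LP) and re-run CORRECT')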


Cheers,

Florian

On Jul 25, 2011, at 1:14 PM, Engin Özkan wrote:


Hi all,

After about a year of not working with XDS, I was a little surprised  
to see that adding a REMOVE.HKL file to the current directory and  
running CORRECT does not remove outliers. I still see the same 
reflections reported at the end of my new CORRECT.LP file, and I have 
grepped the XDS_ASCII.HKL file to observe, to my dismay, that the 
'removed' reflections are still present. Since I do not see this reported, 
and this must be an option used daily by many of you out there, I am 
guessing that I must be doing something wrong, but it escapes me.


For my and anybody else's sanity, here is my REMOVE.HKL file:
$ more REMOVE.HKL
  21   15  -172.10  963.70  0.1638E+06  0.9804E+04 alien
   39  -272.17   58.11  0.2972E+04  0.1788E+03 alien
   3   11  -262.17   32.12  0.1643E+04  0.1002E+03 alien
  259  -202.04   11.89  0.3647E+03  0.1498E+02 alien
   24  -331.87   10.88  0.1215E+03  0.8789E+01 alien
  139  -192.71   10.69  0.1397E+04  0.4445E+02 alien
  177  -291.99   10.65  0.2353E+03  0.1573E+02 alien
  16   14  -291.83   10.58  0.8817E+02  0.9706E+01 alien
  18   12  -281.90   10.16  0.1566E+03  0.1325E+02 alien

I am using the most recently updated version of xds on a Scientific  
Linux 6.0 (64-bit) machine.


Hey, I just figured out that with the December 6, 2010 version, 
installed back in January on a different 64-bit Linux machine, the 
REMOVE.HKL file was used successfully. So this may be a problem with 
the recent bug-fix update (which we have through SBGrid).


All clues, hints, pointers are appreciated.

Best,

Engin


[ccp4bb] merge datasets

2011-03-04 Thread Florian Schmitzberger

Dear All,

I have two questions:

1) I have collected multiple (5) native datasets from the same 
crystal (different parts of the crystal exposed, with different 
transmission and oscillation angles). Each dataset on its own is close 
to complete (96-98 %). Naturally, differences in exposure, the onset of 
radiation damage (the datasets were collected with high transmission) and 
local differences in the crystal will affect the variance of the 
measurement errors for the reflections between the different 
datasets; but I would think that the redundancy and the increased number 
of measurements from all datasets should outweigh this. My tendency is to 
include all datasets.


I am working at 3.8 A resolution (the structure is solved; 80 % solvent 
content). Missing even a few reflections will probably have more of an 
impact at this lower resolution than at higher resolution. Essentially, 
I am trying to obtain a better signal in the 4-3.8 A resolution range, 
where there is also diffuse scattering from the solvent (between 4 and 
3 A) and ice rings, and where the signal from the individual datasets is 
weak. Obviously the criterion will be the quality of the calculated maps, 
but I wanted to know what people's experiences have been in such cases. 
Should I merge the datasets, or rather use them individually for map 
calculations?


2) What is the quickest/easiest way to ensure equivalent indexing in 
CCP4/iMosflm/Scala when merging different datasets together (space 
group P6222)? (In XDS there is REFERENCE_DATA_SET.) Use Pointless, then 
Cad + Scala?
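
For what it is worth, a minimal sketch of the Pointless-based route (command-line 
usage from memory and file names are placeholders; check the Pointless 
documentation):

import subprocess

# Reindex each dataset to match a common reference before merging,
# so that all runs share the same indexing convention.
reference = 'dataset1_unmerged.mtz'
for name in ('dataset2', 'dataset3', 'dataset4', 'dataset5'):
    subprocess.run(['pointless',
                    'HKLIN', f'{name}_unmerged.mtz',
                    'HKLREF', reference,
                    'HKLOUT', f'{name}_reindexed.mtz'],
                   check=True)
# The consistently indexed files can then be combined (e.g. with Cad)
# and scaled together in Scala.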


Cheers,

Florian


[ccp4bb] Effect of NCS on estimate of data:parameter ratio

2010-09-18 Thread Florian Schmitzberger

Dear All,

I have a question regarding the effect of non-crystallographic 
symmetry (NCS) on the data:parameter ratio in refinement.


I am working with X-ray data to a maximum resolution of 4.1-4.4 
Angstrom, with 79 % solvent content, in space group P6222, with 22,300 
unique reflections and an expected 1132 amino-acid residues in the 
asymmetric unit, and proper 2-fold rotational NCS (SAD-phased; no 
high-resolution molecular-replacement or homology model available).


Assuming refinement of x, y, z and B for a polyalanine model (i.e. ca. 
5700 atoms), this would correspond to an observation:parameter ratio of 
roughly 1:1. I think this would be equivalent to a typical protein with 
50 % solvent content diffracting to better than 3 Angstrom resolution 
(from the statistics I could find, at that resolution a mean 
data:parameter ratio of ca. 0.9:1 can be expected for refinement of 
x, y, z and individual isotropic B; ignoring bond-length/angle 
restraints for the moment).
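
The back-of-the-envelope bookkeeping behind that 1:1 estimate (with, purely as a 
what-if, the halved-parameter scenario I ask about below):

# Rough observation:parameter counting for the case described above.
n_unique = 22300               # unique reflections
n_res = 1132                   # residues in the asymmetric unit
atoms_per_ala = 5              # N, CA, C, O, CB for polyalanine
params_per_atom = 4            # x, y, z + isotropic B

n_params = n_res * atoms_per_ala * params_per_atom   # ~22,640 parameters
print(n_unique / n_params)                           # ~0.98, i.e. roughly 1:1

# Hypothetical what-if: a strict 2-fold NCS constraint treated as halving
# the number of independent atomic parameters (the question posed below).
print(n_unique / (n_params / 2))                     # ~1.97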


My question is how I could factor the 2-fold rotational NCS into the 
estimate of the observations, assuming tight NCS restraints (or even 
constraints). It is normally assumed that NCS reduces the noise by a 
factor of the square root of the NCS order, but I am more interested in 
how much it adds on the observation side (used as a restraint) or 
how much it reduces the parameters (used as a constraint). I don't 
suppose it would be correct to assume that the 2-fold NCS would halve 
the number of parameters to refine (assuming an NCS constraint)?


Regards,

Florian

---
Florian Schmitzberger
Biological Chemistry and Molecular Pharmacology
Harvard Medical School
250 Longwood Avenue, SGM 130
Boston, MA 02115, US
Tel: 001 617 432 5602


[ccp4bb] Problem with NCS detection in Parrot

2010-09-03 Thread Florian Schmitzberger

Dear All,

I am encountering a problem when using Parrot (for combined density 
modification and non-crystallographic symmetry (NCS) averaging) in 
CCP4 6.1.13, run via ccp4i.


Parrot does not detect the (2-fold) NCS present in my heavy-atom 
substructure of 20 seleniums (the PDB file was output by Phaser-EP with 
a single chain ID, and is read by Parrot, from what I can tell). I have 
tried splitting the NCS-related heavy atoms into separate chains, but 
Parrot still does not appear to detect any NCS (message: 
'WARNING: No NCS found from heavy atoms').


The Professs program seems to detect the NCS readily. Unfortunately, I  
don't think it is possible to input externally determined NCS  
operators into Parrot.


Regards,

Florian

---
Florian Schmitzberger
Biological Chemistry and Molecular Pharmacology
Harvard Medical School
250 Longwood Avenue, SGM 130
Boston, MA 02115, US
Tel: 001 617 432 5602


[ccp4bb] Format conversion of Shelx coordinate file

2010-08-30 Thread Florian Schmitzberger

Dear All,

What is currently the quickest/easiest way to convert a .hat file with  
fractional coordinates of heavy atoms generated by ShelxE to PDB  
format and/or a file format accepted by Sharp?


I tried to use coordconv from ccp4, but it failed to make the  
conversion.
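
In case it is useful, a rough sketch of doing the conversion by hand (the .hat 
parsing is deliberately naive - it assumes whitespace-separated 'name sfac x y z ...' 
records and skips SHELX instruction lines - and the cell, space group and file 
names are placeholders; gemmi is used only for the fractional-to-orthogonal 
conversion):

import gemmi

cell = gemmi.UnitCell(90.0, 100.0, 120.0, 90.0, 90.0, 90.0)   # placeholder cell
spacegroup = 'P 21 21 21'                                     # placeholder space group
skip = ('TITL', 'CELL', 'ZERR', 'LATT', 'SYMM', 'SFAC', 'UNIT', 'HKLF', 'END')

sites = []
with open('sites.hat') as fh:                                 # placeholder file name
    for line in fh:
        parts = line.split()
        if len(parts) < 5 or parts[0].upper().startswith(skip):
            continue
        x, y, z = (float(v) for v in parts[2:5])              # fractional coordinates
        sites.append(cell.orthogonalize(gemmi.Fractional(x, y, z)))

# Write a simple PDB file; occupancy (1.00) and B-factor (20.00) are placeholders.
with open('sites.pdb', 'w') as out:
    out.write(f'CRYST1{cell.a:9.3f}{cell.b:9.3f}{cell.c:9.3f}'
              f'{cell.alpha:7.2f}{cell.beta:7.2f}{cell.gamma:7.2f} {spacegroup}\n')
    for i, pos in enumerate(sites, start=1):
        out.write('HETATM' + str(i).rjust(5) + ' ' + 'SE'.ljust(4) + ' '
                  + 'SE'.rjust(3) + ' A' + str(i).rjust(4) + '    '
                  + f'{pos.x:8.3f}{pos.y:8.3f}{pos.z:8.3f}'
                  + '  1.00 20.00' + ' ' * 10 + 'SE\n')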


Thank you.

Regards,

Florian

---
Florian Schmitzberger
Biological Chemistry and Molecular Pharmacology
Harvard Medical School
250 Longwood Avenue, SGM 130
Boston, MA 02115, US
Tel: 001 617 432 5602


[ccp4bb] Reindex and Rfree column

2007-11-15 Thread Florian Schmitzberger

Dear All,

I am wondering whether the Free_R column of an mtz file is altered by 
using Reindex (and/or Cad). As I understand it, reindexing 
does not affect the Free_R column; is that correct?


I have solved a structure with a reindexed dataset (P21212 from 
P22121). Now I have datasets of ligand-bound forms of the protein, 
which I would like to reindex (in the same way as the initial 
apo-dataset) and then copy the Free_R flags across from the initial dataset.


What I am concerned about is whether the same reflections will be 
Rfree-flagged in the ligand-bound datasets, as I did not re-sort the 
reflections with Cad after the reindexing of the initial 
apo-dataset. The procedure I have chosen now is to merge and scale, then 
reindex the ligand datasets (with Reindex), and then copy the Rfree 
flags from the original dataset (with Uniqueify), i.e. with no re-sorting 
of the reflections after reindexing. Perhaps I should have sorted the 
reflections after the reindexing in both the apo- and ligand datasets.
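
(As a sanity check after the fact, one could compare the flags directly; a minimal 
sketch, assuming both MTZ files carry a column named 'FreeR_flag' and using gemmi 
just for reading:)

import gemmi

def free_flags(path, label='FreeR_flag'):
    # Return {(h, k, l): flag}; the column label is an assumption - adjust to yours.
    mtz = gemmi.read_mtz_file(path)
    hkl = mtz.make_miller_array()
    flags = mtz.column_with_label(label).array
    return {tuple(idx): int(f) for idx, f in zip(hkl, flags) if f == f}  # skip NaN

apo = free_flags('apo_reindexed.mtz')          # placeholder file names
lig = free_flags('ligand_reindexed.mtz')

common = set(apo) & set(lig)
mismatches = sum(1 for h in common if apo[h] != lig[h])
print(f'{len(common)} common reflections, {mismatches} with differing Free_R flags')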


Thank you very much in advance for any comments!

Florian





[ccp4bb] arp/warp in p22121

2007-09-18 Thread Florian Schmitzberger
Dear All,

I am trying to build a molecular-replacement model with arp/warp in space group 
P22121. Refmac alone seems to be fine with refining the model in P22121, but 
arp/warp fails, as far as I can see at the first Refmac refinement stage. The 
log file says this space group is not supported.

I am wondering whether arp/warp needs the standard Hermann-Mauguin setting of the 
space group, P21212. I suppose I will need to reindex to P21212 in order to use 
arp/warp? (The diffraction data were indexed in XDS, scaled in Scala, and then run 
through CAD to change the space group from P222 to P22121.) I am using arp/warp 
via the ccp4i interface.
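
A hedged sketch of the reindexing step (I believe the operator that maps the 
2, 21, 21 axes of P22121 onto the standard 21, 21, 2 order is h,k,l -> k,l,h, 
but please verify against your own cell; the Pointless keywords and file names 
below should likewise be checked against the documentation):

import subprocess

# Reindex from P22121 to the standard setting P21212 before running arp/warp.
keywords = 'spacegroup P 21 21 2\nreindex k,l,h\n'
subprocess.run(['pointless', 'HKLIN', 'data_p22121.mtz', 'HKLOUT', 'data_p21212.mtz'],
               input=keywords, text=True, check=True)
# Remember to carry the Free_R flags over consistently after reindexing.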

Also, arp/warp gives the following message when I load the mtz file: 'cannot 
extract arp/warp asymmetric unit limits, the job will fail if run'. (I did run 
arp/warp successfully with other mtz files.)

Thank you in advance for any comments!

Florian