[ccp4bb] Postdoctoral position at Exeter

2011-02-08 Thread Littlechild, Jennifer
Please find below a vacancy for a postdoctoral position at Exeter, to work 
on an exciting multidisciplinary EU grant called 'Hotzyme'.
Details can be found at 
http://admin.exeter.ac.uk/personnel/jobs.php?action=job&areaid=4&jid=5163
Professor Jenny Littlechild

Professor Jenny Littlechild
Prof. Biological Chemistry
Director Exeter Biocatalysis Centre
Henry Wellcome Building for Biocatalysis
Stocker Road
Exeter
EX4 4QD, UK
Tel: 44 (0) 1392 263468
Fax: 44 (0) 1392 263489
Email: j.a.littlech...@exeter.ac.uk
www.exeter.ac.uk/biosciences
http://centres.exeter.ac.uk/biocatalysis




[ccp4bb] x-ray sensitive dye?

2011-02-08 Thread Richard Edward Gillilan
Does anyone know of a water soluble dye that changes color upon exposure to 
x-rays?

Preferably from clear to a dark color. Must work in the liquid state 
(non-frozen... so color centers are out). 
Doesn't matter if it is organic or inorganic. 


Thanks

 Richard Gillilan
MacCHESS
Cornell University
Ithaca, NY

[ccp4bb] xds question

2011-02-08 Thread Simon Kolstoe

Dear ccp4bb,

I am quite a fan of XDS and have just upgraded to the latest version.

Normally, to assess the quality of my data, I look at the tables in  
CORRECT.LP and especially the table SUBSET OF INTENSITY DATA WITH  
SIGNAL/NOISE >= 0.0 AS FUNCTION OF RESOLUTION.


However in my latest run I only get a single table for all the data  
i.e. for signal/noise >= -3.0. Is there a command I can put in my  
XDS.INP that will give me all the other tables or has the CORRECT.LP  
logfile been altered in the most recent version of XDS?


(FYI my xds.inp obtained from the ESRF last week is copied below)

Thanks,

Simon



!=== File Automatically generated by mxCuBE
   !=== X-Ray data collected at: ESRF_ID14-1
   !=== Detector type: ADSC Quantum Q210
   !=== Date: Fri Feb 04 03:39:09 2011
   !=== User comments:

   JOB= ALL !XYCORR INIT COLSPOT IDXREF DEFPIX XPLAN INTEGRATE CORRECT
   !JOB= DEFPIX XPLAN INTEGRATE CORRECT

   DATA_RANGE= 1 190
   SPOT_RANGE= 1 20
   SPOT_RANGE= 1 4
   !SPOT_RANGE= 187 190
   BACKGROUND_RANGE= 1 4

   SECONDS=60
   MINIMUM_NUMBER_OF_PIXELS_IN_A_SPOT= 6
   STRONG_PIXEL= 6.0

   OSCILLATION_RANGE= 1.000
   STARTING_ANGLE= 0.000
   STARTING_FRAME= 1
   X-RAY_WAVELENGTH=  0.93340
   NAME_TEMPLATE_OF_DATA_FRAMES= ./data/sk1_1_???.img

   !STARTING_ANGLES_OF_SPINDLE_ROTATION= 0 180 10
   !TOTAL_SPINDLE_ROTATION_RANGES= 60 180 10

   DETECTOR_DISTANCE= 298.55
   DETECTOR= ADSC   MINIMUM_VALID_PIXEL_VALUE= 1   OVERLOAD= 65000
   ORGX= 1014.79   ORGY= 1029.10
   NX=  2048   NY=  2048   QX= 0.10200   QY= 0.10200
   VALUE_RANGE_FOR_TRUSTED_DETECTOR_PIXELS= 7000 3

   DIRECTION_OF_DETECTOR_X-AXIS= 1.0 0.0 0.0
   DIRECTION_OF_DETECTOR_Y-AXIS= 0.0 1.0 0.0
   ROTATION_AXIS= 1.0 0.0 0.0
   INCIDENT_BEAM_DIRECTION= 0.0 0.0 1.0
   FRACTION_OF_POLARIZATION= 0.98
   POLARIZATION_PLANE_NORMAL= 0.0 1.0 0.0
   !== Default value recommended
   !AIR= 0.00026895

   SPACE_GROUP_NUMBER= 0
   UNIT_CELL_CONSTANTS= 0 0 0 0 0 0
   INCLUDE_RESOLUTION_RANGE= 50.0 2.4
   RESOLUTION_SHELLS= 15.0 8.0 4.0 3.0 2.8 2.6 2.5 2.4
   FRIEDEL'S_LAW= FALSE

   !FRIEDEL'S_LAW= TRUE
   TRUSTED_REGION= 0 1.40

   REFINE(INTEGRATE)= BEAM ORIENTATION CELL
   !== Default value recommended
   !DELPHI= 3.000
   MAXIMUM_NUMBER_OF_PROCESSORS= 16
   !MAXIMUM_NUMBER_OF_JOBS= 16


Re: [ccp4bb] xds question

2011-02-08 Thread Tim Gruene
Dear Simon, 

I don't know how to change the output written to the log-file (maybe with the
TEST-card), but I wonder why you don't want to look at all of your data but only
those with I/sigI > 0? XDS reports all reflections with I/sigI > -3 for a good
reason.

Cheers, Tim

On Tue, Feb 08, 2011 at 12:17:35PM +, Simon Kolstoe wrote:
 Dear ccp4bb,
 
 I am quite a fan of XDS and have just upgraded to the latest version.
 
 Normally, to assess the quality of my data, I look at the tables in
 CORRECT.LP and especially the table SUBSET OF INTENSITY DATA WITH
 SIGNAL/NOISE >= 0.0 AS FUNCTION OF RESOLUTION.
 
 However in my latest run I only get a single table for all the data
 i.e. for signal/noise >= -3.0. Is there a command I can put in my
 XDS.INP that will give me all the other tables or has the CORRECT.LP
 logfile been altered in the most recent version of XDS?
 
 (FYI my xds.inp obtained from the ESRF last week is copied below)
 
 Thanks,
 
 Simon
 
 
 
   [XDS.INP quoted in full in the original message - snipped]

-- 
--
Tim Gruene
Institut fuer anorganische Chemie
Tammannstr. 4
D-37077 Goettingen

phone: +49 (0)551 39 22149

GPG Key ID = A46BEE1A



signature.asc
Description: Digital signature


[ccp4bb] Crystal Imaging Instruments?

2011-02-08 Thread Chris Morris
Hi,

Which crystal imaging system do you use, if any? Are you satisfied with the 
software available to run it?

This information would help set priorities for future development of xtalPiMS, 
so I would be grateful for replies.

xtalPiMS currently supports Formulatrix instruments, and I am now adding 
support for ThermoElectron (Rhombix).

regards,
Chris

Chris Morris   
chris.mor...@stfc.ac.uk
Tel: +44 (0)1925 603689  Fax: +44 (0)1925 603634
Mobile: 07921-717915
https://www.pims-lims.org/
Daresbury Lab,  Daresbury,  Warrington,  UK,  WA4 4AD 
 


Re: [ccp4bb] x-ray sensitive dye?

2011-02-08 Thread Bosch, Juergen
Hi Richard,
I assume you want to track down which parts of a crystal you already have 
exposed ?
I think dithionite leads to brown colouring of the protein after/during 
exposure. At least we've observed frequently that some ingredients in our 
crystals turn brown over time and you could clearly see where the beam hit. I 
don't know how applicable this would be for small beamsizes and a couple of 
images, but for whole datasets it would work.
James Holton probably has a build-it-yourself solution to this, I expect.

Jürgen

-
Jürgen Bosch
Johns Hopkins Bloomberg School of Public Health
Department of Biochemistry & Molecular Biology
Johns Hopkins Malaria Research Institute
615 North Wolfe Street, W8708
Baltimore, MD 21205
Phone: +1-410-614-4742
Lab:  +1-410-614-4894
Fax:  +1-410-955-2926
http://web.mac.com/bosch_lab/ (http://web.me.com/bosch_lab/)

On Feb 8, 2011, at 6:32 AM, Richard Edward Gillilan wrote:

Does anyone know of a water soluble dye that changes color upon exposure to 
x-rays?

Preferably from clear to a dark color. Must work in the liquid state 
(non-frozen... so color centers are out).
Doesn't matter if it is organic or inorganic.


Thanks

Richard Gillilan
MacCHESS
Cornell University
Ithaca, NY



Re: [ccp4bb] xds question

2011-02-08 Thread Bosch, Juergen
Hi Tim,

could you write a bit more about this cutoff. I've been using the 0 cutoff for 
the longest time as it seemed to be much more stringent to report.

Jürgen

-
Jürgen Bosch
Johns Hopkins Bloomberg School of Public Health
Department of Biochemistry & Molecular Biology
Johns Hopkins Malaria Research Institute
615 North Wolfe Street, W8708
Baltimore, MD 21205
Phone: +1-410-614-4742
Lab:  +1-410-614-4894
Fax:  +1-410-955-2926
http://web.mac.com/bosch_lab/ (http://web.me.com/bosch_lab/)

On Feb 8, 2011, at 7:45 AM, Tim Gruene wrote:

Dear Simon,

I don't know how to change the output written to the log-file (maybe with the
TEST-card), but I wonder why you don't want to look at all of your data but only
those with I/sigI > 0? XDS reports all reflections with I/sigI > -3 for a good
reason.

Cheers, Tim

On Tue, Feb 08, 2011 at 12:17:35PM +, Simon Kolstoe wrote:
 [original message and XDS.INP quoted in full - snipped]




Re: [ccp4bb] xds question

2011-02-08 Thread Tim Gruene
Hello Juergen,

since sigma is always positive, a negative I/sigI means that the measured
intensity is negative, hence I would refer you to the truncate reference
"On the treatment of negative intensity observations" by S. French and K.
Wilson, Acta Cryst. (1978) A34, 517-525.

Why exactly -3 is chosen as the cut-off I cannot say, but it's better than
0 as a cut-off, if I am not mistaken.

Cheers, Tim


On Tue, Feb 08, 2011 at 08:33:13AM -0500, Bosch, Juergen wrote:
 Hi Tim,
 
 could you write a bit more about this cutoff. I've been using the 0 cutoff 
 for the longest time as it seemed to be much more stringent to report.
 
 Jürgen
 
 [signature and earlier messages quoted in full - snipped]

-- 
--
Tim Gruene
Institut fuer anorganische Chemie
Tammannstr. 4
D-37077 Goettingen

phone: +49 (0)551 39 22149

GPG Key ID = A46BEE1A



signature.asc
Description: Digital signature


Re: [ccp4bb] Crystal Imaging Instruments?

2011-02-08 Thread Annie Hassell
Chris--

We use the Formulatrix Rock Imager & are quite satisfied with it.

Thanks!
Annie 



-Original Message-
From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of Chris 
Morris
Sent: Tuesday, February 08, 2011 8:21 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: [ccp4bb] Crystal Imaging Instruments?

Hi,

Which crystal imaging system do you use, if any? Are you satisfied with the 
software available to run it?

This information would help set priorities for future development of xtalPiMS, 
so I would be grateful for replies.

xtalPiMS currently supports Formulatrix instruments, and I am currently adding 
support for ThermoElectron (Rhombix).

regards,
Chris

Chris Morris   
chris.mor...@stfc.ac.uk
Tel: +44 (0)1925 603689  Fax: +44 (0)1925 603634
Mobile: 07921-717915
https://www.pims-lims.org/
Daresbury Lab,  Daresbury,  Warrington,  UK,  WA4 4AD 
 


Re: [ccp4bb] xds question

2011-02-08 Thread Robert Immormino
Hi,
I've pasted below the reasons from Dan Gewirth and the HKL2000 manual
authors for having a -3 sigma cutoff... I'll add briefly that if you
assume the weak data has a Gaussian distribution around zero, a -3
sigma cutoff allows you to record ~99.8% of the data.
-bob


SIGMA CUTOFF

Cutoff for rejecting measurements on input. Default = -3.0. Be very
careful if you increase this.

What is the rationale for using sigma cutoff -3.0 in SCALEPACK?
Wouldn't you want to reject all negative intensities? Why shouldn't
you use a sigma cutoff of 1.0 or zero? The answer to these questions is
as follows: The best estimate of I may be negative, due to background
subtraction and background fluctuation. Negative measurements
typically represent random fluctuations in the detector's response to
an X-ray signal. If a measurement is highly negative (<= -3 sigma)
then it may be more likely the result of a mistake, rather than just
random fluctuation.

If one eliminates negative fluctuations, but not the positive ones
before averaging, the result will be highly biased. In SCALEPACK,
sigma cutoff is applied before averaging. If one rejects all negative
intensities before averaging, a number of things would happen:

   1.  The averaged intensity would always be positive;
   2.  For totally random data with redundancy 8, in a shell where
there was no signal, there would be on average 4 positive
measurements, with an average intensity of one sigma. This is because the
negative measurements had been thrown out. So the average of the four
remaining measurements would be about 2 sigma! This would look like a
resolution shell with a meaningful signal;
   3.  R-merge would always be less than the R-merge with negative
measurements included;
   4.  A SIGMA CUTOFF of 1 would improve R-merge even more, by
excluding even more valid measurements.

Why should this worry you? Exclusion of valid measurements will
deteriorate the final data set. One may notice an inverse relationship
between R-merge and data quality as a function of sigma cutoff. So
much for using R-merge as any criterion of success.

Even the best (averaged) estimate of intensity may be negative. How to
use negative I estimates in subsequent phasing and refinement steps is
a separate story. The author of SCALEPACK suggests the following:

   1. You should never convert I into F.
   2. You should square Fcalc and compare it to I. Most, but not all
of the crystallography programs do not do this. That is life. In the
absence of the proper treatment one can do approximations. One of them
is provided by French and also by French and Wilson. An implementation
of their ideas is in the CCP4 program TRUNCATE. A very simplified and
somewhat imprecise implementation of TRUNCATE is this:

if I > sigma(I), F=sqrt(I)

if I < sigma(I), F=sqrt(sigma(I))
format:  SIGMA CUTOFF value
default: -3
example: SIGMA CUTOFF -2.5

referenced from:
http://www.hkl-xray.com/hkl_web1/hkl/Scalepack_Keywords.html
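
As a quick numerical sanity check on the argument quoted above, here is a small
Python sketch (illustrative only; it assumes numpy and scipy are available and
uses unit sigmas throughout, so the numbers are simulated, not from any real
data set):

   # Sketch: fraction of noise kept by a -3 sigma cutoff, and the bias from
   # rejecting negative observations before merging (zero-signal shell).
   import numpy as np
   from scipy.stats import norm

   # I/sigma >= -3 keeps norm.cdf(3) ~ 0.9987 of pure-noise measurements,
   # i.e. the ~99.8% quoted above.
   print("fraction kept by -3 sigma cutoff:", norm.cdf(3.0))

   # 20000 reflections, redundancy 8, true I = 0, sigma = 1 for every observation.
   rng = np.random.default_rng(0)
   obs = rng.normal(0.0, 1.0, size=(20000, 8))

   mean_all = obs.mean(axis=1).mean()          # averaging everything: unbiased, ~0

   merged_i_over_sig = []
   for row in obs:
       pos = row[row > 0]                      # throw away the negative observations
       if pos.size:                            # merged sigma = 1/sqrt(n_kept)
           merged_i_over_sig.append(pos.mean() * np.sqrt(pos.size))

   print("merged <I>, all observations kept :", round(float(mean_all), 3))
   print("merged I/sigma, negatives rejected:", round(float(np.mean(merged_i_over_sig)), 2))

   # The 'very simplified' TRUNCATE-like rule quoted above (the real French &
   # Wilson treatment is a proper Bayesian estimate, not this shortcut).
   def simple_f(i, sig_i):
       return np.sqrt(i) if i > sig_i else np.sqrt(sig_i)

Keeping everything averages to zero, as it should, while rejecting the negatives
gives a merged I/sigma of roughly 1.5-2 in a shell that contains no signal at
all, which is exactly the spurious "signal" the SCALEPACK documentation warns
about.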


Re: [ccp4bb] coot command cannot be located in phenix

2011-02-08 Thread Pavel Afonine
Hi,

make sure you have Coot installed. Coot is not part of Phenix. This may be
useful:

http://www.phenix-online.org/download/other.html
http://www.phenix-online.org/documentation/coot.htm

Pavel.

On Mon, Feb 7, 2011 at 4:52 PM, LISA science...@gmail.com wrote:

 Hi all,
 I installed phenix1.7 on my mac osx 10.6. But I cannot open coot in this
 phenix.When I press coot, it said coot command cannot be located. How to
 fix this problem? Thank.
 Lisa



Re: [ccp4bb] xds question

2011-02-08 Thread Petri Kursula
Hi,
oh, I'm also surprised people seem to use something other than '-3' as the 
cutoff, i.e. are throwing away data. This, obviously, casts new light on all 
the discussions (which I definitely don't wish to restart) on the 'cutoff 
values' in R(sym) and I/sI which you use to determine the 'resolution 
limit'...and gives one more thing for referees to think about/require when 
looking at Table 1. I am sure most of them, and the readers, take it for 
granted that no data were thrown out before calculating those numbers...and 
sure, the effects of actually using those data might occasionally be more 
severe than a drop of, say, 1% in the apparent overall R(sym) or an increase in 
I/sI.
Petri

On Feb 8, 2011, at 3:07 PM, Robert Immormino wrote:

 [Robert's message quoted in full - snipped]


---
Petri Kursula, PhD
Group Leader and Docent of Neurobiochemistry (University of Oulu, Finland)
Visiting Scientist (CSSB-HZI, DESY, Hamburg, Germany)
www.biochem.oulu.fi/kursula
www.desy.de/~petri
petri.kurs...@oulu.fi
petri.kurs...@desy.de
---



Re: [ccp4bb] Let's talk pseudotranslational symmetry (or maybe it's bad data).

2011-02-08 Thread Phil Evans
As a general principle, I would always run MR searches in all 8 P2x2x2x space 
groups just to get some controls, if you haven't done it already. Trivial to do 
in Phaser

Phil
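
For anyone who has not tried this before, a minimal Phaser keyword script along
these lines should cover all eight space groups in a single run (the file name,
column labels, molecular weight, model identity and job root below are
placeholders, not values taken from this thread):

   MODE MR_AUTO
   HKLIN mydata.mtz
   LABIN F=F SIGF=SIGF
   ENSEMBLE model PDBFILE partial_model.pdb IDENTITY 90
   COMPOSITION PROTEIN MW 23000 NUM 2
   SEARCH ENSEMBLE model NUM 2
   SGALTERNATIVE SELECT ALL
   ROOT p2x2x2x_screen

SGALTERNATIVE SELECT ALL makes Phaser test every space group consistent with the
point group (here the eight primitive orthorhombic ones) and rank the solutions;
running eight separate jobs with SPACEGROUP set explicitly is the equivalent
done by hand.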

On 8 Feb 2011, at 17:49, Francis E Reyes wrote:

 Hi all
 
 I have a case of a dataset that indexed, integrated, and scaled well in P 21 
 21 21 (55.6410   81.6493  147.1294   90.   90.   90.). The data 
 has an Mn(I/sd) of 2.1 at 3.5 A with an Rpim of about 0.398 in the highest 
 resolution shell (3.49-3.58).
 
 Analysis with phenix.xtriage warns of pseudotranslational symmetry (26% of 
 origin).
 
 
  x  y  zheight   p-value(height)
 ( 0.500, 0.000, 0.233 ) :   26.344   (2.681e-03)
 ( 0.000, 0.338, 0.000 ) :5.380   (8.476e-01)
 
 If the observed pseudo translationals are crystallographic
 the following spacegroups and unit cells are possible:
 
 space groupoperator unit cell of reference setting
 C 2 2 21 (b-1/4,c-1/4,2*a)   x+1/2, y, z+1/4  (73.64, 55.47, 81.46,  
 90.00, 90.00, 90.00)
 
 From what I've read about pseudo c-centering via pseudotranslational 
 symmetry, the problem exhibits itself with alternating weak and strong 
 reflections at low resolution, which become consistent at high resolution. 
 Inspection of the h+k parity groups via truncate does not show this behavior.
 
 Despite the fact the data was collected at the anomalous peak, I do not 
 observe any anomalous signal (DelAnom correlation between half-sets is 0.013 
 for all data).
 
 Using a reasonably complete model (80%) I searched for two molecules in the 
 ASU in space group P 21 21 21 and obtained a solution at TFZ=22.1 for two 
 molecules related solely by a translation.  However the electron density maps 
 (after rigid body refinement) are not great (or maybe my expectations are too 
 high). I am encouraged by the fact the density is weak for a region of the 
 model which should have a different conformation, while strong density is 
 maintained for the rest of the molecule.
 
 Is this the proper way to approach pseudotranslation (i.e. is there any 
 reason to believe that the solution obtained by MR is not the correct 
 solution?).
 
 Is the space group determined? (i.e. does the pseudo c-centering affect 
 pointless's ability to analyze the systematic absences?).
 
 Is the lack of a pattern of alternating weak/strong reflections normal (would 
 observing this behavior be dependent on the crystal orientation)?
 
 any advice would be greatly appreciated! (especially from those who have had 
 a case like this before)
 
 
 F
 
 
 -
 Francis E. Reyes M.Sc.
 215 UCB
 University of Colorado at Boulder
 
 gpg --keyserver pgp.mit.edu --recv-keys 67BA8D5D
 
 8AE2 F2F4 90F7 9640 28BC  686F 78FD 6669 67BA 8D5D


[ccp4bb] 1st Annual CLS Mx Data Collection School

2011-02-08 Thread Shaun Labiuk
The Canadian Macromolecular Crystallography Facility (CMCF) is pleased to 
introduce an intensive 5-day hands-on data collection school at the Canadian 
Light Source (CLS) synchrotron. The School will take place May 16 - 20, 2011. 
Participants will attend a series of lectures and be actively engaged in 
macromolecular crystallography (Mx) data collection at CMCF beamlines. 
Completing the school will be an essential step to making use of the beamlines 
remotely and will better equip participants to effectively collect diffraction 
data on-site. Additionally, this year’s special topic will be an in-depth look 
at the use of PHENIX for data analysis and structure solution with invited 
speaker Dr. Paul Adams. Participants should have a basic grounding in 
crystallography prior to attending the course. Application deadline is March 1, 
2011. Please visit the CMCF website for more information and application form 
at http://cmcf.lightsource.ca/school


[ccp4bb] Post-Doctoral position available at Queen's University, Kingston, Ontario, Canada

2011-02-08 Thread John Allingham

Subject:
Post-Doctoral position available at Queen's University, Kingston,  
Ontario, Canada


Project Description: A postdoctoral position is available immediately  
for structure-function studies of cytoskeletal motor proteins in human  
and agricultural fungal pathogens in the Department of Biochemistry at  
Queen's University. Projects investigating the structural basis of  
secondary metabolite biosynthesis and biofilm generation in these  
organisms are also available. For these studies, combined approaches  
in molecular biology, biochemistry, biophysics and X-ray  
crystallography are employed. The laboratory is fully equipped with  
all the required infrastructure and in-house X-ray crystallography  
instrumentation. We also have full access to several synchrotron  
beamlines at APS, CHESS, and NSLS.


Qualifications:
The position requires a Ph.D. degree with experience in protein  
purification, crystallization and successful structure determination.


How to Apply:
Interested candidates should send a CV and names and contact  
information for 3 references by email to:


John S. Allingham (Email: allin...@queensu.ca)
Assistant Professor
Tier 2 Canada Research Chair in Structural Biology
Department of Biochemistry
Botterell Hall
Queen's University
18 Stuart St., Rm 641
Kingston, ON
K7L 3N6 Canada

Phone: (613) 533-3137
Fax: (613) 533-2022



For more information, please visit http://meds.queensu.ca/biochem/biochemistry_faculty?id=36 


Re: [ccp4bb] quality homology model

2011-02-08 Thread Oliv Eidam

Hi Susy,

Before going into details: you have to get the alignment right. After 
that, you may look into packing quality and geometry of your models.


Regarding the parameters you are interested in: I talked to a guy from 
the Sali lab (Modeller) about quality assessment of homology models. He 
recommends to compare the gromos or anolea profiles per residue (or 
whatever packing score your software uses) to the template. In case 
there is a gap: pay attention to the alignment! Regarding cut-off values 
(what's ok and what not) you may want to look into published work on 
your modeling software and/or contact the authors directly.


Another way to look at model quality, and maybe more significant, is to 
check geometry (clashes, rotamers, bond angles etc.). You could use 
MolProbity for that. The supplementary material of the following article 
contains a good example (Table S1) on the evaluation of a trimer interface:

http://www.ncbi.nlm.nih.gov/pubmed/20457942

Good luck!

  Oliv


On 02/07/11 01:13, Susy Rao wrote:

Dear Community,

I have a question regarding protein model quality for introduction of 
point mutations which should increase solubility/stability of my protein 
of interest.


I plan to do homology models so that I can check the most promising 
amino acids.


Which quality/resolution of the model do I need?
Which parameters would you check (gromos, anolea), and what would you 
use as cut-off values?


At the moment I thought, that it could be interesting to model a 
homologue of my protein and check the RMSDs with the crystal structure.


Maybe you know some good literature, describing this?

Thank you

Susy



--
Oliv Eidam, Ph.D.
Postdoctoral fellow

University of California, San Francisco
Dept. of Pharmaceutical Chemistry
1700 4th Street, Byers Hall North, Room 501
San Francisco, CA 94158 - Box 2550

Phone: 415-514-4253
Fax  : 415-514-4260
Email: eid...@blur.compbio.ucsf.edu





[ccp4bb] N-terminal sequencing

2011-02-08 Thread Junyu Xiao

Hi,

Sorry for non-crystallography related questions. I am seeking protein  
N-terminal sequencing service. However, the facilities I have worked  
with previously (Michigan State and UCSD) were both closed. Does  
anyone know any companies or core facilities that can do this?


Thanks,
Junyu

---
Junyu Xiao, Ph.D.
University of California, San Diego
Leichtag Room 283
9500 Gilman Drive, 0721
La Jolla, CA 92093-0721
Lab phone: 858-822-0684




Re: [ccp4bb] xds question

2011-02-08 Thread Kay Diederichs

Hi Simon,

I've put my answer into the XDSwiki article FAQ 
http://strucbio.biologie.uni-konstanz.de/xdswiki/index.php/FAQ#why_do_the_latest_XDS.2FXSCALE_versions_only_give_a_single_table.2C_with_I.2Fsigma.3E.3D-3_cutoff.3F 
because I was asked this privately a couple of times, after this was 
changed for the May 2010 version.


In short, the table describes the data that are written out by 
XDS/XSCALE, and giving more than one table obscures this fact and tends 
to confuse users ("which table should I use?" - "the -3 sigma cutoff 
table" - "then why are there the others?").


Others have already mentioned in this thread that SCALEPACK uses the 
same cutoff of -3 sigma. I'm not sure about the cutoff in MOSFLM/SCALA.


I do realize now that some people have been using the tables for 
deciding on a suitable resolution cutoff. With a bit of scripting, this 
can be overcome - pls ask me by email.


hope that helps,

Kay
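
In the meantime, one way to get similar per-shell numbers yourself is a rough
Python sketch like the following (not Kay's script and not part of XDS; it
assumes the default record layout of XDS_ASCII.HKL, i.e. H K L IOBS SIGMA(IOBS)
as the first five items of each record, and reads the cell from the header so
the shells work for a general cell):

   # Per-shell <I/sigma> from XDS_ASCII.HKL for an arbitrary I/sigma cutoff.
   import sys
   import numpy as np

   def read_xds_ascii(path):
       cell, refl = None, []
       with open(path) as fh:
           for line in fh:
               if line.startswith("!UNIT_CELL_CONSTANTS="):
                   cell = [float(x) for x in line.split("=")[1].split()[:6]]
               elif not line.startswith("!"):
                   parts = line.split()
                   if len(parts) >= 5:
                       refl.append([float(x) for x in parts[:5]])
       return cell, np.array(refl)

   def resolutions(cell, hkl):
       a, b, c = cell[:3]
       al, be, ga = np.radians(cell[3:])
       # direct metric tensor G; 1/d^2 = h . G^-1 . h
       G = np.array([[a * a, a * b * np.cos(ga), a * c * np.cos(be)],
                     [a * b * np.cos(ga), b * b, b * c * np.cos(al)],
                     [a * c * np.cos(be), b * c * np.cos(al), c * c]])
       inv_d2 = np.einsum("ij,jk,ik->i", hkl, np.linalg.inv(G), hkl)
       return 1.0 / np.sqrt(inv_d2)

   cell, data = read_xds_ascii(sys.argv[1])
   cutoff = float(sys.argv[2]) if len(sys.argv) > 2 else 0.0   # e.g. 0.0 or -3.0
   good = data[:, 4] > 0                  # negative sigma flags rejected reflections
   d = resolutions(cell, data[:, :3])
   i_sig = data[:, 3] / np.where(good, data[:, 4], 1.0)
   keep = good & (i_sig >= cutoff)
   shells = [15.0, 8.0, 4.0, 3.0, 2.8, 2.6, 2.5, 2.4]          # as in RESOLUTION_SHELLS=
   for hi, lo in zip([999.0] + shells[:-1], shells):
       sel = keep & (d <= hi) & (d > lo)
       if sel.any():
           print("%6.2f - %4.2f   n=%7d   <I/sigma>=%6.2f"
                 % (hi, lo, sel.sum(), i_sig[sel].mean()))

This only reproduces the spirit of the old CORRECT.LP subset tables (reflection
counts and mean I/sigma per shell at a chosen cutoff), not their exact columns.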

Am 08.02.2011 20:58, schrieb Simon Kolstoe:

Dear ccp4bb,

I am quite a fan of XDS and have just upgraded to the latest version.

Normally, to assess the quality of my data, I look at the tables in
CORRECT.LP and especially the table SUBSET OF INTENSITY DATA WITH
SIGNAL/NOISE >= 0.0 AS FUNCTION OF RESOLUTION.

However in my latest run I only get a single table for all the data
i.e. for signal/noise >= -3.0. Is there a command I can put in my
XDS.INP that will give me all the other tables or has the CORRECT.LP
logfile been altered in the most recent version of XDS?

(FYI my xds.inp obtained from the ESRF last week is copied below)

Thanks,

Simon



Re: [ccp4bb] N-terminal sequencing

2011-02-08 Thread Edward A. Berry

A colleague here has had good results with the facility at Iowa state:
  http://www.biotech.iastate.edu/service_facilities/protein.html

If you just want to identify the protein, mass spec may be cheaper.

The same place will do that, or we have had excellent results with
  http://www.appliedbiomics.com/

If you need to confirm the N-terminus of a construct, Edman degradation is probably still more cost 
effective.




Junyu Xiao wrote:

Hi,

Sorry for non-crystallography related questions. I am seeking protein
N-terminal sequencing service. However, the facilities I have worked
with previously (Michigan state and UCSD) were both closed. Does anyone
know any companies or core facilities that can do this?

Thanks,
Junyu

---
Junyu Xiao, Ph.D.
University of California, San Diego
Leichtag Room 283
9500 Gilman Drive, 0721
La Jolla, CA 92093-0721
Lab phone: 858-822-0684




Re: [ccp4bb] xds question

2011-02-08 Thread Bryan Lepore
On Tue, Feb 8, 2011 at 7:17 AM, Simon Kolstoe s.kols...@ucl.ac.uk wrote:
 XDS [...] signal/noise >= -3.0.

i'd be interested to know if there is an equivalent in scala...
perhaps 'REJECT 6 ALL -8'

-Bryan


Re: [ccp4bb] N-terminal sequencing

2011-02-08 Thread Daniel Bonsor
We have used Alphalyse (http://www.alphalyse.com/picknpost.html). 

Dan


Re: [ccp4bb] xds question

2011-02-08 Thread Phil Evans
No cutoffs in Scala
Phil

On 8 Feb 2011, at 20:13, Kay Diederichs wrote:

 [Kay's message quoted in full - snipped]


[ccp4bb] Postdoctoral position at the University of Washington - Seattle

2011-02-08 Thread Ethan Merritt
A postdoctoral position is available in the group of Ethan Merritt
to work on the structure-based design and development of drugs
targeting disease caused by eukaryotic parasites.  This project
combines crystal structure determination done in the Merritt Lab
with synthetic chemistry and biological assays against the target
parasites carried out by collaborating UW groups. The current focus
of the project is to further develop a set of lead compounds that we
have determined to have nanomolar activity against Toxoplasma and
Cryptosporidium.  Our goal is to modify these lead compounds to
optimize their efficacy against the target parasites while
maintaining their non-toxicity in humans.  We hope over the next
several years to bring these compounds to the point of initial
clinical trials.  Crystallographic work will include structure
determination of homologous proteins from related parasites, which 
may allow us to expand the project to target additional diseases.
Two recent papers from this project provide additional information:
  Ojo et al., Nature Structural & Molecular Biology 17:602 (2010)
  Murphy et al., ACS Med. Chem. Letters 1:331 (2010)

The Merritt Lab is part of the Biomolecular Structure Center (BMSC)
at the University of Washington School of Medicine. The BMSC provides
a collaborative environment that brings together crystallographers,
computational chemists, and synthetic chemists.  We also collaborate
closely with UW groups in the departments of Chemistry, Global Health,
and Tropical Medicine.

Email inquiries and applications to merr...@u.washington.edu.
Please enclose a CV, a summary of research interests and previous
crystallographic experience, and contact information for three people
who might provide a reference letter or recommendation.


-- 
Ethan A Merritt
Department of Biochemistry
Biomolecular Structure Center, Mailstop 357742
University of Washington, Seattle 98195, USA


[ccp4bb] Detaching crystals from glass cover slides

2011-02-08 Thread Wataru Kagawa
Hi all,

I have crystals growing by the hanging-drop method, using 24-well VDX plates 
and Hampton Research siliconized glass cover slides. Most crystals are attached 
to the cover slide, and I am having difficulties detaching the crystals (using 
a cryoloop) without breaking them. There are a few smaller crystals floating in 
the drop, and they diffract X-rays pretty well (clean spots, ~3.5A resolution 
using RAXIS). However, I would like to try the bigger ones, because they may 
diffract to a higher resolution.

Any suggestions for detaching crystals from cover slides will be greatly 
appreciated.

Wataru


Re: [ccp4bb] Detaching crystals from glass cover slides

2011-02-08 Thread William G. Scott
Hi Wataru:

I hope all is well.  For the ones you already have grown, try very gently 
prying them off with a wedge-shaped needle.

If you can grow more, try using a very thin smooth layer of vacuum grease, and 
apply the drop to that.  I managed to get RNA crystals to grow that way that 
otherwise irreversibly adhered to the surface.

All the best,

Bill


On Feb 8, 2011, at 6:06 PM, Wataru Kagawa wrote:

 Hi all,
 
 I have crystals growing by the hanging-drop method, using 24-well VDX plates 
 and Hampton Research siliconized glass cover slides. Most crystals are 
 attached to the cover slide, and I am having difficulties detaching the 
 crystals (using a cryoloop) without breaking them. There are few smaller 
 crystals floating in the drop, and they diffract X-rays pretty well (clean 
 spots, ~3.5A resolution using RAXIS). However, I would like to try the bigger 
 ones, because they may diffract to a higher resolution.
 
 Any suggestions for detaching crystals from cover slides will be greatly 
 appreciated.
 
 Wataru


Re: [ccp4bb] Detaching crystals from glass cover slides

2011-02-08 Thread Thirumananseri Kumarevel
Dear Wataru-san:
I understand the problem; I have sometimes faced the same difficulties.
One suggestion for picking up the crystal:
1. Hold the cover glass in hand (the drop should face the bottom side, as in
the hanging-drop setup).
2. Adjust the microscope to see the crystals (bottom side) and decide which
one you are going to pick up (keep in mind that you are going to use a
cryo-loop, so you need enough space between the cover glass and the bottom of
the microscope) [please look at the attachments for a quick understanding].
3. Use the cryoloop from the bottom of the cover glass and pick up the crystals;
moving the crystals might be easier...
All the best.
With regards,
Kumarevel

-Original Message-
From: CCP4 bulletin board [mailto:CCP4BB@jiscmail.ac.uk] On Behalf Of Wataru
Kagawa
Sent: Wednesday, February 09, 2011 11:06 AM
To: CCP4BB@jiscmail.ac.uk
Subject: [ccp4bb] Detaching crystals from glass cover slides

Hi all,

I have crystals growing by the hanging-drop method, using 24-well VDX plates
and Hampton Research siliconized glass cover slides. Most crystals are
attached to the cover slide, and I am having difficulties detaching the
crystals (using a cryoloop) without breaking them. There are few smaller
crystals floating in the drop, and they diffract X-rays pretty well (clean
spots, ~3.5A resolution using RAXIS). However, I would like to try the
bigger ones, because they may diffract to a higher resolution.

Any suggestions for detaching crystals from cover slides will be greatly
appreciated.

Wataru


cryoloop.ppt
Description: MS-Powerpoint presentation


Re: [ccp4bb] Detaching crystals from glass cover slides

2011-02-08 Thread Bosch, Juergen
Have you tried seeding?
And another suggestion: if the small ones already diffract to 3.5 Å on a home 
source, why don't you try to go to a synchrotron?
Big is also not always better; they might freeze worse than your small ones or 
might have growth artifacts etc.

If you take several huge crystals and touch them gently with e.g. a hair from a 
Unicorn*, then let them sit and recover, you might get lucky.

Jürgen

* can be replaced by any stronger hair of your preference, e.g. pot-bellied pig, 
horse, or cat (anything that's a bit more sturdy), or simply the Microtools from HR
-
Jürgen Bosch
Johns Hopkins Bloomberg School of Public Health
Department of Biochemistry & Molecular Biology
Johns Hopkins Malaria Research Institute
615 North Wolfe Street, W8708
Baltimore, MD 21205
Phone: +1-410-614-4742
Lab:  +1-410-614-4894
Fax:  +1-410-955-2926
http://web.mac.com/bosch_lab/ (http://web.me.com/bosch_lab/)

On Feb 8, 2011, at 9:06 PM, Wataru Kagawa wrote:

Hi all,

I have crystals growing by the hanging-drop method, using 24-well VDX plates 
and Hampton Research siliconized glass cover slides. Most crystals are attached 
to the cover slide, and I am having difficulties detaching the crystals (using 
a cryoloop) without breaking them. There are few smaller crystals floating in 
the drop, and they diffract X-rays pretty well (clean spots, ~3.5A resolution 
using RAXIS). However, I would like to try the bigger ones, because they may 
diffract to a higher resolution.

Any suggestions for detaching crystals from cover slides will be greatly 
appreciated.

Wataru



Re: [ccp4bb] quality homology model

2011-02-08 Thread Robbie Joosten
Dear Susy,

You describe two things: making a homology model and designing mutants.

As Oliv pointed out, the first step is getting the alignment right. Even with a 
multiple sequence alignment and knowledge of the template that can be quite 
difficult. You may need to make several models for alternative alignments.
When validating your model, packing is IMO the most useful thing to look at. 
Rosetta-holes is very good at finding poor models. What_check also has very 
good packing analysis and also flags unsatisfied hydrogen bond donors/acceptors 
which may help a lot. There are also some compound scores such as the 
MolProbity score and Q-mean which may help you to find the best model from a 
set.
The quality of the template is also important. Get the best template or use 
more than one. If the sequence identity isn't too low you should also use 
templates from pdb_redo. They helped (a lot?) in the previous CASP.

If you want to analyse single mutants you can try to work from simple 
principles such as minimizing the loss of torsional freedom of the main and 
side chain (entropy) or optimizing the gain of water entropy when the protein 
folds. Also minimising the loss of hydrogen bonds upon folding is quite useful. 
Just don't expect miracles.

Cheers,
Robbie Joosten



Date: Tue, 8 Feb 2011 10:52:09 -0800
From: eid...@blur.compbio.ucsf.edu
Subject: Re: [ccp4bb] quality homology model
To: CCP4BB@JISCMAIL.AC.UK

[Oliv's message and Susy's original quoted in full - snipped]


[ccp4bb] Ken Olsen, Founder of Digital Equipment Corporation, Died Sunday

2011-02-08 Thread Dale Tronrud

   I see in the news that Ken Olsen has died.  Although he was
not a crystallographer I think we should stop for a moment to
remember the profound impact the company that this man founded
had on our field.

   My first experience in a crystallography lab was as an undergraduate
in M. Sundaralingam's lab in Madison, Wisconsin.  While I never had
the opportunity to use them, his two diffractometers were controlled
by the ubiquitous PDP-8 computers.  I had more experience with his
main computer, which was either a PDP-11/34 or 35 (Ethan help me out!).
This was connected to a Vector General graphics display running software
called UWVG.  Having the least stature in the lab I got the midnight
to 4am time slot for model building.  The computer took about 10
minutes to compute and contour each block of map, covering about
three residues.  While waiting I would crawl under the DECwriter and
nap.  The computer would stop rattling when the map was up and that
would wake me.

   When I joined the Matthews lab in Oregon they had a VAX 11/780.
What an amazing machine!  It had 1 MB of RAM and could run a million
instructions in a second.  It only took 48 hours to run a cycle of
refinement with PROLSQ, that is, if no one else used the computer.
These specs don't sound like much but this computer was really a
revolution for computational crystallography.  That a single lab
could own a computer of such power was unheard of before this.
It wasn't just that the computer had so much RAM (We later got it
up to its max of 4 MB.) but the advent of virtual memory made
program design so much easier.  You could simply define an array
of 100,000 elements and not have to worry about finding where in
memory, mixed in with the operating system, utility programs, and
other users' software, you could find an unused block that big.

   Digital didn't invent virtual memory, but the VAX made it
achievable for regular crystallographers.  Through most of the 1980's
you didn't have to worry about getting your code to run on other
computers - Everyone had access to a VAX.

   In the 1990's DEC came out with the alpha CPU chip which really
broke ground for performance.  These things screamed when it came
to running crystallographic software.  In 1999 the lab bought
several of the 666 MHz models.  It was about four years before
Intel came out with a chip that would match these alphas on my
crystallography benchmark and they had to be clocked at over 2 GHz
to do it.

   Yes, Digital lost out in the competition of the marketplace, and
Ken Olsen was pushed out of the company well before the end.  But
what a ride it was.  What great computers they were and what great
science was done on them!

Dale Tronrud