Re: [ccp4bb] frm2frm

2011-10-26 Thread Pedro M. Matias
You need to register at their site first, I suppose. Have you tried
to ask them for the software?


At 06:06 26-10-2011, khuchtumur bumerdene wrote:

Hello,

Does anyone know where I could download frm2frm utility from Bruker? 
Is it even possible to do so?


Industry and Medicine Applied Crystallography
Macromolecular Crystallography Unit
___
Phones : (351-21) 446-9100 Ext. 1669
  (351-21) 446-9669 (direct)
Fax   : (351-21) 441-1277 or 443-3644

email : mat...@itqb.unl.pt

http://www.itqb.unl.pt/research/biological-chemistry/industry-and-medicine-applied-crystallography
http://www.itqb.unl.pt/labs/macromolecular-crystallography-unit

Mailing address :
Instituto de Tecnologia Quimica e Biologica
Apartado 127
2781-901 OEIRAS
Portugal


Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Frank von Delft
Since when has the cost of any project been limited by the cost of
hardware?  Someone has to /implement/ this -- and make a career out of
it; thunderingly absent from this thread has been the chorus of
volunteers who will write the grant.

phx


On 25/10/2011 21:10, Herbert J. Bernstein wrote:

To be fair to those concerned about cost, a more conservative estimate
from the NSF RDLM workshop last summer in Princeton is $1,000 to $3,000
per terabyte per year for long term storage allowing for overhead in
moderate-sized institutions such as the PDB.  Larger entities, such
as Google are able to do it for much lower annual costs in the range of
$100 to $300 per terabyte per year.  Indeed, if this becomes a serious
effort, one might wish to consider involving the large storage farm
businesses such as Google and Amazon.  They might be willing to help
support science partially in exchange for eyeballs going to their sites.

Regards,
H. J. Bernstein

At 1:56 PM -0600 10/25/11, James Stroud wrote:

On Oct 24, 2011, at 3:56 PM, James Holton wrote:


The PDB only gets about 8000 depositions per year


Just to put this into dollars. If each dataset is about 1.7 GB in
size, then that's about 14 TB of storage that needs to come online
every year to store the raw data for every structure. A two second
search reveals that Newegg has a 3 TB Hitachi for $200. So that's
about $1000 / year of storage for the raw data behind PDB deposits.

James




Re: [ccp4bb] COOT not connected to PHENIX

2011-10-26 Thread Tim Gruene

Grand - with every python script one has to distribute a specific python
version. More food for my prejudice against python ;-)
Tim

On 10/26/2011 12:15 AM, Paul Emsley wrote:
 FYI, from version 0.7, I will not distribute binaries without python.
 
 Paul.
 
 On 25/10/11 17:02, Yuri wrote:
 I installed the version with python embedded in coot
 And it works!!
 Thanks a lot!!

 On Tue, 25 Oct 2011 11:41:38 -0700, Nathaniel Echols wrote:
 On Tue, Oct 25, 2011 at 11:40 AM, Yuri yuri.pom...@ufl.edu wrote:
 Now here comes the stupid question...
 How do I fix it?
 Install a different coot version or is it something in my
 architecture?
 Install a different Coot.  If you're downloading from Paul Emsley's
 page, you need a package with python in the file name.  I have no
 idea whether the Linux binaries distributed by CCP4 have Python or
 not
 (the Mac version definitely does).

 -Nat
 

--
Dr Tim Gruene
Institut fuer anorganische Chemie
Tammannstr. 4
D-37077 Goettingen

GPG Key ID = A46BEE1A



Re: [ccp4bb] frm2frm

2011-10-26 Thread Tim Gruene

Good luck - I registered with Bruker one or two years ago and have not
heard a reply since ...
Tim

On 10/26/2011 09:32 AM, Pedro M. Matias wrote:
 You need to register at their site first, I suppose. Have you tried to
 ask them for the software?
 
 At 06:06 26-10-2011, khuchtumur bumerdene wrote:
 Hello,

 Does anyone know where I could download frm2frm utility from Bruker?
 Is it even possible to do so?
 



Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Graeme Winter
Hi James,

Just to pick up on your point about the Pilatus detectors. Yesterday,
in 2 hours of giving a beamline a workout (admittedly with Thaumatin),
we acquired 400+ GB of data*. Now I appreciate that this is not
really routine operation, but it does raise an interesting point - if
you have loaded a sample and centred it, collected test shots and
decided it's not that great, why not collect anyway, as it may later
prove to be useful?

Bzzt. 2 minutes or less later you have a full data set, and barely
even time to go get a cup of tea.

This does to some extent move the goalposts, as you can acquire far
more data than you need. You never know, you may learn something
interesting from it - perhaps it has different symmetry or packing?
What it does mean is that, if we can have a method of tagging these
data, there may be massively more opportunity to get also-ran data
sets to the methods-development types. What it also means, however, is
that the cost of curating these data is then an order of magnitude higher.

Moving the data around is also rather more painful.

Anyhow, I would try to avoid dismissing the effect that new continuous
readout detectors will have on data rates, from experience it is
pretty substantial.

Cheerio,

Graeme

*by data here I mean images, rather than information, which is
rather more time-consuming to acquire. I would argue you get that
from processing / analysing the data...

On 24 October 2011 22:56, James Holton jmhol...@lbl.gov wrote:
 The Pilatus is fast, but for decades now we have had detectors that can read
 out in ~1s.  This means that you can collect a typical ~100 image dataset in
 a few minutes (if flux is not limiting).  Since there are ~150 beamlines
 currently operating around the world and they are open about 200 days/year,
 we should be collecting ~20,000,000 datasets each year.

 We're not.

 The PDB only gets about 8000 depositions per year, which means either we
 throw away 99.96% of our images, or we don't actually collect images
 anywhere near the ultimate capacity of the equipment we have.  In my
 estimation, both of these play about equal roles, with ~50-fold attrition
 between ultimate data collection capacity and actual collected data, and
 another ~50 fold attrition between collected data sets and published
 structures.

 Personally, I think this means that the time it takes to collect the final
 dataset is not rate-limiting in a typical structural biology
 project/paper.  This does not mean that the dataset is of little value.
  Quite the opposite!  About 3000x more time and energy is expended preparing
 for the final dataset than is spent collecting it, and these efforts require
 experimental feedback.  The trick is figuring out how best to compress the
 data used to solve a structure for archival storage.  Do the previous
 data sets count?  Or should the compression be lossy about such
 historical details?  Does the stuff between the spots matter?  After all,
 h,k,l,F,sigF is really just a form of data compression.  In fact, there is
 no such thing as raw data.  Even raw diffraction images are a
 simplification of the signals that came out of the detector electronics.
  But we round-off and average over a lot of things to remove noise.
  Largely because noise is difficult to compress.  The question of how much
 compression is too much compression depends on which information (aka noise)
 you think could be important in the future.

 When it comes to fine-sliced data, such as that from Pilatus, the main
 reason why it doesn't compress very well is not because of the spots, but
 the background.  It occupies thousands of times more pixels than the spots.
  Yes, there is diffuse scattering information in the background pixels, but
 this kind of data is MUCH smoother than the spot data (by definition), and
 therefore is optimally stored in larger pixels.  Last year, I messed around
 a bit with applying different compression protocols to the spots and the
 background, and found that ~30 fold compression can be easily achieved if
 you apply h264 to the background and store the spots with lossless png
 compression:

 http://bl831.als.lbl.gov/~jamesh/lossy_compression/

 I think these results speak to the relative information content of the
 spots and the pixels between them.  Perhaps at least the online version of
 archived images could be in some sort of lossy-background format?  With the
 real images in some sort of slower storage (like a room full of tapes that
 are available upon request)?  Would 30-fold compression make the storage of
 image data tractable enough for some entity like the PDB to be able to
 afford it?
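The split-stream idea is easy to sketch: keep pixels above a threshold (the spots) losslessly, and store the smooth background as coarse block averages. The few lines of Python below are a toy stand-in for the real h264/PNG pipeline described above - the threshold and 4x4 tile size are arbitrary illustrative choices, not the parameters used on the linked page:

```python
# Toy illustration of two-stream image compression: spot pixels are kept
# exactly; the smooth background is stored as block averages (lossy).

def split_compress(image, threshold=100, block=4):
    """Return (spots, background) streams for a square 2-D pixel list."""
    n = len(image)
    # Spots: sparse list of (row, col, value) for pixels above threshold.
    spots = [(r, c, image[r][c])
             for r in range(n) for c in range(n)
             if image[r][c] > threshold]
    # Background: one average per block*block tile, spot pixels excluded.
    bg = []
    for r0 in range(0, n, block):
        row = []
        for c0 in range(0, n, block):
            vals = [image[r][c]
                    for r in range(r0, r0 + block)
                    for c in range(c0, c0 + block)
                    if image[r][c] <= threshold]
            row.append(sum(vals) / len(vals) if vals else 0.0)
        bg.append(row)
    return spots, bg

# 8x8 frame: flat background of 10 with one bright 'spot' at (2, 5).
frame = [[10] * 8 for _ in range(8)]
frame[2][5] = 5000
spots, bg = split_compress(frame)
print(spots)                # the spot survives exactly
print(len(bg) * len(bg[0]), "background samples instead of", 8 * 8)
```

The spot stream stays exact while the background shrinks 16-fold here; real images need a smarter spot mask, but the information asymmetry is the same.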


 I go to a lot of methods meetings, and it pains me to see the most brilliant
 minds in the field starved for interesting data sets.  The problem is that
 it is very easy to get people to send you data that is so bad that it can't
 be solved by any software imaginable (I've got piles of that!).  As a
 developer, what you really need is a right answer 

Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Matthew BOWLER
Whether to archive all raw data and subsequently make them public is 
something that the large facilities are currently debating.  Here at 
the ESRF we store user data for only 6 months (and I believe it is 
available longer on tape) and we already have trouble 
with capacity.  My personal view is that facilities should take the lead 
on this - for MX we already have a very good archiving system - ISPyB - 
also running at Diamond.  ISPyB stores lots of metadata and JPEGs of the 
raw images, but not the images themselves - just a link to the location of 
the data, with an option to download if still available.  My preferred 
option would be to store all academically funded data and then make it 
publicly available after say 2-5 years (this will no doubt spark another 
debate on time limits, special dispensation etc).  What needs to be 
thought about is how to order the data and how to make sure that the 
correct metadata are stored with each data set - this will rely heavily 
on user input at the time of the experiment rather than on gathering 
together data sets for depositions much later.  As already mentioned, 
this type of resource could be extremely useful for developers and also 
as a general scientific resource.  Smells like an EU grant to me. 
Cheers, Matt.



On 26/10/2011 10:21, Frank von Delft wrote:
Since when has the cost of any project been limited by the cost of 
hardware?  Someone has to /implement/ this -- and make a career out of 
it; thunderingly absent from this thread has been the chorus of 
volunteers who will write the grant.

phx





--
Matthew Bowler
Structural Biology Group
European Synchrotron Radiation Facility
B.P. 220, 6 rue Jules Horowitz
F-38043 GRENOBLE CEDEX
FRANCE
===
Tel: +33 (0) 4.76.88.29.28
Fax: +33 (0) 4.76.88.29.04

http://go.esrf.eu/MX
http://go.esrf.eu/Bowler
===



Re: [ccp4bb] frm2frm

2011-10-26 Thread Tim Gruene

Dear list,

I just re-registered at
http://www.brukersupport.com/Login.aspx?ReturnUrl=%2fdefault.aspx and,
helped along by a notification email to Martin Adam
(martin.a...@bruker-axs.de), the registration succeeded within a couple
of minutes.

Tim

On 10/26/2011 10:34 AM, Tim Gruene wrote:
 Good luck - I registered with Bruker one or two years ago and have not
 heard a reply since ...
 Tim
 
 On 10/26/2011 09:32 AM, Pedro M. Matias wrote:
 You need to register at their site first, I suppose. Have you tried to
 ask them for the software?
 
 At 06:06 26-10-2011, khuchtumur bumerdene wrote:
 Hello,

 Does anyone know where I could download frm2frm utility from Bruker?
 Is it even possible to do so?
 
 



Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread George M. Sheldrick
This raises an important point. The new continuous-readout detectors, such as
the Pilatus for beamlines or the Bruker Photon for in-house use, enable the
crystal to be rotated at constant velocity, eliminating the mechanical errors
associated with 'stop and go' data collection. Storing their data in 'frames'
is an artificial construction that is currently required for the established
data integration programs but is in fact throwing away information. Maybe in
10 years' time 'frames' will be as obsolete as punched cards!

George

On Wed, Oct 26, 2011 at 09:39:40AM +0100, Graeme Winter wrote:
 Hi James,
 
 Just to pick up on your point about the Pilatus detectors. Yesterday
 in 2 hours of giving a beamline a workout (admittedly with Thaumatin)
 we acquired 400+ GB of data*. Now I appreciate that this is not
 really routine operation, but it does raise an interesting point - if
 you have loaded a sample and centred it, collected test shots and
 decided it's not that great, why not collect anyway as it may later
 prove to be useful?
 
 Bzzt. 2 minutes or less later you have a full data set, and barely
 even time to go get a cup of tea.
 
 This does to some extent move the goalposts, as you can acquire far
 more data than you need. You never know, you may learn something
 interesting from it - perhaps it has different symmetry or packing?
 What it does mean is that, if we can have a method of tagging these
 data, there may be massively more opportunity to get also-ran data
 sets to the methods-development types. What it also means, however, is
 that the cost of curating these data is then an order of magnitude higher.
 
 Moving the data around is also rather more painful.
 
 Anyhow, I would try to avoid dismissing the effect that new continuous
 readout detectors will have on data rates, from experience it is
 pretty substantial.
 
 Cheerio,
 
 Graeme
 
 *by data here I mean images, rather than information, which is
 rather more time-consuming to acquire. I would argue you get that
 from processing / analysing the data...
 
 On 24 October 2011 22:56, James Holton jmhol...@lbl.gov wrote:
  The Pilatus is fast, but for decades now we have had detectors that can read
  out in ~1s.  This means that you can collect a typical ~100 image dataset in
  a few minutes (if flux is not limiting).  Since there are ~150 beamlines
  currently operating around the world and they are open about 200 days/year,
  we should be collecting ~20,000,000 datasets each year.
 
  We're not.
 
  The PDB only gets about 8000 depositions per year, which means either we
  throw away 99.96% of our images, or we don't actually collect images
  anywhere near the ultimate capacity of the equipment we have.  In my
  estimation, both of these play about equal roles, with ~50-fold attrition
  between ultimate data collection capacity and actual collected data, and
  another ~50 fold attrition between collected data sets and published
  structures.
 
  Personally, I think this means that the time it takes to collect the final
  dataset is not rate-limiting in a typical structural biology
  project/paper.  This does not mean that the dataset is of little value.
   Quite the opposite!  About 3000x more time and energy is expended preparing
  for the final dataset than is spent collecting it, and these efforts require
  experimental feedback.  The trick is figuring out how best to compress the
  data used to solve a structure for archival storage.  Do the previous
  data sets count?  Or should the compression be lossy about such
  historical details?  Does the stuff between the spots matter?  After all,
  h,k,l,F,sigF is really just a form of data compression.  In fact, there is
  no such thing as raw data.  Even raw diffraction images are a
  simplification of the signals that came out of the detector electronics.
   But we round-off and average over a lot of things to remove noise.
   Largely because noise is difficult to compress.  The question of how much
  compression is too much compression depends on which information (aka noise)
  you think could be important in the future.
 
  When it comes to fine-sliced data, such as that from Pilatus, the main
  reason why it doesn't compress very well is not because of the spots, but
  the background.  It occupies thousands of times more pixels than the spots.
   Yes, there is diffuse scattering information in the background pixels, but
  this kind of data is MUCH smoother than the spot data (by definition), and
  therefore is optimally stored in larger pixels.  Last year, I messed around
  a bit with applying different compression protocols to the spots and the
  background, and found that ~30 fold compression can be easily achieved if
  you apply h264 to the background and store the spots with lossless png
  compression:
 
  http://bl831.als.lbl.gov/~jamesh/lossy_compression/
 
  I think these results speak to the relative information content of the
  spots and the pixels between them.  Perhaps at least the 

Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Loes Kroon-Batenburg

Dear James,

Good analysis! You bring up important points.

On 10/24/11 23:56, James Holton wrote:

The Pilatus is fast, but for decades now we have had detectors that can read
out in ~1s. This means that you can collect a typical ~100 image dataset in a
few minutes (if flux is not limiting). Since there are ~150 beamlines
currently operating around the world and they are open about 200 days/year,
we should be collecting ~20,000,000 datasets each year.

We're not.

The PDB only gets about 8000 depositions per year, which means either we
throw away 99.96% of our images, or we don't actually collect images anywhere
near the ultimate capacity of the equipment we have. In my estimation, both
of these play about equal roles, with ~50-fold attrition between ultimate
data collection capacity and actual collected data, and another ~50 fold
attrition between collected data sets and published structures.


Your estimation says: we collect 1/50 * 20,000,000 = 400,000 data sets, of which 
only 8,000 get deposited.
An average Pilatus data set (0.1 degree scan) takes about 4 GB (compressed, 
without losing information - EVAL can read those!). Storing the 8,000 data sets, 
as James Stroud mentions, cannot be the problem.
It is the 392,000 other data sets that we have to find a home for. That would be 
1568 TB and would cost $49,000/year. This may be a slight overestimation, but 
it shows us the problems we face if we want to store ALL raw data.
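For what it's worth, the arithmetic above can be checked in a few lines of Python - a back-of-envelope sketch using only the round numbers quoted in this thread (the two per-terabyte rates are the extremes Herbert Bernstein mentioned earlier, not quotes for any real service, so the resulting annual costs are illustrative ranges, not Loes's $49,000 figure):

```python
# Back-of-envelope check of the data volumes discussed in this thread.
datasets_per_year = 20_000_000 // 50   # Holton's ~50-fold attrition: ~400,000 collected sets
deposited = 8_000                      # approximate PDB depositions per year
undeposited = datasets_per_year - deposited

gb_per_dataset = 4                     # average compressed Pilatus set (figure quoted above)
tb_total = undeposited * gb_per_dataset / 1000  # decimal terabytes

print(f"{undeposited} undeposited data sets -> {tb_total:.0f} TB per year")
# Annual cost at the two per-terabyte rates mentioned earlier in the thread:
for label, usd_per_tb_year in [("Google-scale", 100), ("institutional", 1000)]:
    cost = tb_total * usd_per_tb_year
    print(f"  at ${usd_per_tb_year}/TB/yr ({label}): ${cost:,.0f}/yr")
```

Whatever per-terabyte rate one believes, the undeposited sets dominate the deposited ones by a factor of ~50, which is the real point.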


Even if we could find a way to store all these data, how would we set up a 
useful database?  If we store all data by name, date and beamline, we will in 
the end inevitably be drowning in a sea of information. It is very unlikely that 
the very interesting data sets will ever be found and used.
It would be much more useful if every data set were annotated by the user 
or beamline scientist - e.g. impossible to index, bad data from the integration 
step, overlap, diffuse streaks, etc. Such information could be part of the 
metadata. This, however, takes time, and may not fit the eagerness to get results 
from one of the other data sets recorded during the same synchrotron trip.

I am afraid that just throwing data sets into a big pool will not be very useful.

Loes.
--
__

Dr. Loes Kroon-Batenburg
Dept. of Crystal and Structural Chemistry
Bijvoet Center for Biomolecular Research
Utrecht University
Padualaan 8, 3584 CH Utrecht
The Netherlands

E-mail : l.m.j.kroon-batenb...@uu.nl
phone  : +31-30-2532865
fax: +31-30-2533940
__


[ccp4bb] Program announcement: Nautilus v0.2

2011-10-26 Thread Kevin Cowtan

New version of Nautilus (my nucleic acid building program).

Main changes:
 - Now available for both OSX(x86) and Linux.
 - It no longer corrupts the residue names of any existing non-nucleic 
acid model you feed in.

 - Most cases where the output model clashes with itself have been fixed.
 - I've done some (fairly conservative) optimisations. It's now very 
fast. Very, very fast. Faster than something quite fast moving not 
entirely slowly.


It is available from here:
http://www.ysbl.york.ac.uk/~cowtan/nautilus/nautilus.html

..

'Nautilus' is a program for automatic model building of nucleotide
structures in electron density maps. It will trace a map with no model,
extend an existing model, or add nucleotide chains to an existing
non-nucleotide model.

'Nautilus' does not currently perform refinement - you will need to
refine and recycle for further model building yourself. Neither does
it assign sequence - the model is built as poly-U.

This is an alpha release. It may not work at all. It has only
been tested on synthetic data with simulated errors.
The API will change significantly in subsequent releases.


Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread John R Helliwell
Dear Frank,
re 'who will write the grant?'.

This is not as easy as it sounds, would that it were!

There are two possible business plans:-
Option 1. Specifically for MX, the PDB is the first and foremost
candidate to seek such additional funds for full diffraction data
deposition for each future PDB deposition entry. This business plan
possibility is best answered by PDB/EBI (e.g. Gerard Kleywegt has
answered this in the negative thus far, at CCP4 in January 2010).

Option 2 The Journals that host the publications could add the cost to
the subscriber and/or the author according to their funding model. As
an example and as a start a draft business plan has been written by
one of us [JRH] for IUCr Acta Cryst E; this seemed attractive because
of its simpler 'author pays' financing. This proposed business plan is
now with IUCr Journals to digest and hopefully refine. Initial
indications are that Acta Cryst C would be perceived by IUCr Journals
as a better place to start considering this in detail, as it involves
fewer crystal structures than Acta E and would thus be more
manageable. The overall advantage of the responsibility being with
Journals as we see it is that it encourages such 'archiving of data
with literature' across all crystallography related techniques (single
crystal, SAXS, SANS, Electron crystallography etc) and fields
(Biology, Chemistry, Materials, Condensed Matter Physics etc) ie not
just one technique and field, although obviously biology is dear to
our hearts here in the CCP4bb.

Yours sincerely,
John and Tom
John Helliwell  and Tom Terwilliger

On Wed, Oct 26, 2011 at 9:21 AM, Frank von Delft
frank.vonde...@sgc.ox.ac.uk wrote:
 Since when has the cost of any project been limited by the cost of
 hardware?  Someone has to implement this -- and make a career out of it;
 thunderingly absent from this thread has been the chorus of volunteers who
 will write the grant.
 phx


 On 25/10/2011 21:10, Herbert J. Bernstein wrote:

 To be fair to those concerned about cost, a more conservative estimate
 from the NSF RDLM workshop last summer in Princeton is $1,000 to $3,000
 per terabyte per year for long term storage allowing for overhead in
 moderate-sized institutions such as the PDB.  Larger entities, such
 as Google are able to do it for much lower annual costs in the range of
 $100 to $300 per terabyte per year.  Indeed, if this becomes a serious
 effort, one might wish to consider involving the large storage farm
 businesses such as Google and Amazon.  They might be willing to help
 support science partially in exchange for eyeballs going to their sites.

 Regards,
H. J. Bernstein

 At 1:56 PM -0600 10/25/11, James Stroud wrote:

 On Oct 24, 2011, at 3:56 PM, James Holton wrote:

 The PDB only gets about 8000 depositions per year

 Just to put this into dollars. If each dataset is about 1.7 GB in
 size, then that's about 14 TB of storage that needs to come online
 every year to store the raw data for every structure. A two second
 search reveals that Newegg has a 3 TB Hitachi for $200. So that's
 about $1000 / year of storage for the raw data behind PDB deposits.

 James





-- 
Professor John R Helliwell DSc


Re: [ccp4bb] COOT not connected to PHENIX

2011-10-26 Thread Ed Pozharski
On Wed, 2011-10-26 at 10:33 +0200, Tim Gruene wrote:
 with every python script one has to distribute a specific python
 version 

... and with every program one has to distribute binaries for every
platform... more food for my prejudice against software ;-)

This really is not about python, it's about distributing with or without
dependencies.  And you are absolutely right about that: for example,
ccp4-6.2.0 comes with python2.6.7 embedded, and, if one goes with
defaults and downloads coot with it, python2.6 in coot's lib folder.
Same with phenix - you get python2.7 with it and python2.4 with
pymol0.99 that comes with it.  By the way, I already have another pymol
that I compiled myself (1.4) and the one from ubuntu repositories (1.2).
Except for the latter, each carries its own copy of whichever python it
needs.  Every single python avatar takes 50-100 MB of space, which is
fortunately not in short supply.

This is why the right way to distribute *nix software is to distribute
software itself and ask the end-user to get all the dependencies (not
that hard these days).  It is fully understood, of course, that people
that do this for living find it more troublesome to deal with me whining
about how their software is screwing up my matplotlib than to just give
me another python copy. What's an extra 50 MB between friends ;-)

Cheers,

Ed.

-- 
I'd jump in myself, if I weren't so good at whistling.
   Julian, King of Lemurs


[ccp4bb] Question about drawing disulfide bonds in Pymol

2011-10-26 Thread Ke, Jiyuan
Dear All,

I have a question regarding drawing disulfide bonds in PyMOL. The overall 
structure is in cartoon representation. When I display the disulfide-bond 
residues as stick models, the cysteine residues sitting on a beta strand have 
a gap from the actual beta strand. If I turn off the flat-sheet option, all 
the beta strands look very wavy, but the cysteines then do sit on the beta 
strands. I want to change only the two beta strands with cysteine residues on 
them to wavy strands and keep the other beta strands flat-looking. Does anyone 
know a way to do this, or a better way to draw individual residues on beta 
strands? Thanks!
jiyuan


Jiyuan Ke, Ph.D.
Research Scientist
Van Andel Research Institute
333 Bostwick Ave NE
Grand Rapids, MI 49503
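One possible starting point, sketched below but untested on this structure: PyMOL's cartoon_side_chain_helper setting redraws displayed side chains from the smoothed cartoon trace, which usually closes exactly this kind of gap without unflattening the sheets; cartoon_flat_sheets can also be switched off for a single object (the object name "mystruct" here is a placeholder):

```pml
# Hedged sketch - standard PyMOL settings, object name is a placeholder.
hide everything
show cartoon
show sticks, resn CYS and not name N+C+O
# Draw shown side chains from the smoothed cartoon backbone,
# so the sticks meet the flattened sheet without a gap:
set cartoon_side_chain_helper, on
# Alternative: disable sheet flattening for one object only:
# set cartoon_flat_sheets, off, mystruct
```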




Re: [ccp4bb] frm2frm

2011-10-26 Thread Ed Pozharski
This thread may be relevant

http://www.mail-archive.com/ccp4bb@jiscmail.ac.uk/msg18422.html


On Wed, 2011-10-26 at 15:06 +1000, khuchtumur bumerdene wrote:
 Hello,
 
 Does anyone know where I could download frm2frm utility from Bruker?
 Is it even possible to do so?

-- 
Hurry up before we all come back to our senses!
   Julian, King of Lemurs


Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Gerard Bricogne
Dear John and colleagues,

 There seem to be a set of centrifugal forces at play within this thread
that are distracting us from a sensible path of concrete action by throwing
decoys in every conceivable direction, e.g.

 * Pilatus detectors spew out such a volume of data that we can't
possibly archive it all - does that mean that because the 5th generation of
Dectris detectors will be able to write one billion images a second and
catch every scattered photon individually, we should not try and archive
more information than is given by the current merged structure factor data?
That seems a complete failure of reasoning to me: there must be a sensible
form of raw data archiving that would stand between those two extremes and
would retain much more information than the current merged data but would
step back from the enormous degree of oversampling of the raw diffraction
pattern that the Pilatus and its successors are capable of.

 * It is all going to cost an awful lot of money, therefore we need a
team of grant writers to raise its hand and volunteer to apply for resources
from one or more funding agencies - there again there is an avoidance of
the feasible by invocation of the impossible. The IUCr Forum already has an
outline of a feasibility study that would cost only a small amount of
joined-up thinking and book-keeping around already stored information, so
let us not use the inaccessibility of federal or EC funding as a scarecrow
to justify not even trying what is proposed there. And the idea that someone
needs to decide to stake his/her career on this undertaking seems totally
overblown.

 Several people have already pointed out that the sets of images that
would need to be archived would be a very small subset of the bulk of
datasets that are being held on the storage systems of synchrotron sources.
What needs to be done, as already described, is to be able to refer to those
few datasets that gave rise to the integrated data against which deposited
structures were refined (or, in some cases, solved by experimental phasing),
to give them special status in terms of making them visible and accessible
on-line at the same time as the pdb entry itself (rather than after the
statutory 2-5 years that would apply to all the rest, probably in a more
off-line form), and to maintain that accessibility for ever, with a link
from the pdb entry and perhaps from the associated publication. It seems
unlikely that this would involve the mobilisation of such large resources as
to require either a human sacrifice (of the poor person whose life would be
staked on this gamble) or writing a grant application, with the indefinite
postponement of action and the loss of motivation this would imply.

 Coming back to the more technical issue of bloated datasets, it is a
scientific problem that must be amenable to rational analysis to decide on a
sensible form of compression of overly-verbose sets of thin-sliced, perhaps
low-exposure images that would already retain a large fraction, if not all,
of the extra information on which we would wish future improved versions of
processing programs to cut their teeth, for a long time to come. This
approach would seem preferable to stoking up irrational fears of not being
able to cope with the most exaggerated predictions of the volumes of data to
archive, and thus doing nothing at all.
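As a toy illustration of why thin-sliced, low-exposure frames should compress well: such a frame is mostly zero-count pixels, so even a generic lossless codec shrinks it dramatically. The frame size and photon counts below are illustrative assumptions, and zlib stands in for whatever detector-specific scheme would actually be used.

```python
import random
import zlib

random.seed(0)
width = height = 1024              # illustrative detector edge, not a real Pilatus size
npix = width * height

# Simulate a sparse, low-exposure frame: ~0.5% of pixels carry a small count.
frame = bytearray(npix)
for _ in range(npix // 200):
    frame[random.randrange(npix)] = random.randint(1, 20)

compressed = zlib.compress(bytes(frame), level=6)
ratio = npix / len(compressed)
print(f"raw: {npix} bytes, compressed: {len(compressed)} bytes, ~{ratio:.0f}x")
```

On data this sparse even a general-purpose codec recovers well over an order of magnitude, and purpose-built schemes for diffraction images should do better still.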

 I very much hope that the "can do" spirit that marked the final
discussions of the DDDWG (Diffraction Data Deposition Working Group) in
Madrid will emerge on top of all the counter-arguments that consist in
moving the goal posts to prove that the initial goal is unreachable.


 With best wishes,
 
  Gerard.

--
On Wed, Oct 26, 2011 at 02:18:25PM +0100, John R Helliwell wrote:
 Dear Frank,
 re 'who will write the grant?'.
 
 This is not as easy as it sounds, would that it were!
 
 There are two possible business plans:-
 Option 1. Specifically for MX is the PDB as the first and foremost
 candidate to seek such additional funds for full diffraction data
 deposition for each future PDB deposiition entry. This business plan
 possibility is best answered by PDB/EBI (eg Gerard Kleywegt has
 answered this in the negative thus far at the CCP4 January 2010).
 
 Option 2 The Journals that host the publications could add the cost to
 the subscriber and/or the author according to their funding model. As
 an example and as a start a draft business plan has been written by
 one of us [JRH] for IUCr Acta Cryst E; this seemed attractive because
 of its simpler 'author pays' financing. This proposed business plan is
 now with IUCr Journals to digest and hopefully refine. Initial
 indications are that Acta Cryst C would be perceived by IUCr Journals
 as a better place to start considering this in detail, as it involves
 fewer crystal structures than Acta E and would thus be more
 manageable. The overall advantage of the responsibility being with
 Journals as we see it is that it 

[ccp4bb] Structural biologist position at Constellation Pharmaceuticals

2011-10-26 Thread Steve Bellon
Scientist Structural Biology
Constellation Pharmaceuticals is a rapidly growing biopharmaceutical company 
dedicated to the development of novel therapeutics in the emerging field of 
Epigenetics.
The structural biology group at Constellation is looking for an enthusiastic 
and collaborative scientist to contribute to our drug discovery efforts.
This position will be part of an expanding structural biology group, with 
responsibilities including (but not limited to) conducting laboratory 
experiments covering all aspects of crystallography from crystal growth to 
structure solution, analyzing and interpreting scientific data, collaborating 
with other scientists, and performing computer-assisted structure-based drug 
design.
The successful candidate will have a background in chemistry and an in-depth 
understanding of target/ligand interactions.  2-5 years of post-doctoral 
experience is required, and industrial experience (in pharmaceutical or 
biotech) is a plus.  A strong commitment to communicating structural information 
to colleagues in chemistry, enzymology and biology is expected.

Interested Candidates should forward their CV to care...@constellationpharma.com



For more information about Constellation, please visit us at 
http://www.constellationpharma.com.


Re: [ccp4bb] Question about drawing disulfide bonds in Pymol

2011-10-26 Thread Ed Pozharski
On Wed, 2011-10-26 at 10:15 -0400, Ke, Jiyuan wrote:
 flat sheet option

IIUC, the set command in pymol allows per-selection application, i.e. if
you try this in the command line instead of checking the option in the
menu:

set cartoon_flat_sheets, 0, blah

where blah is your selection.

-- 
Oh, suddenly throwing a giraffe into a volcano to make water is crazy?
Julian, King of Lemurs


Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Patrick Shaw Stewart
Could you perhaps use the principle of capture storage that is used by
wild-life photographers with high-speed cameras?

The principle is that the movie is written to the same area of memory,
jumping back to the beginning when it is full (this part is not essential,
but it makes the principle clear).  Then, when the photographer takes his
finger off the trigger, the last x seconds is permanently stored.  So you
keep your wits about you, and press the metaphorical store button just
*after* you have got the movie in the can, so to speak
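A minimal sketch of that trigger-after-the-fact "capture storage" idea: frames stream into a fixed-size ring buffer, and pressing the (metaphorical) store button flushes whatever is currently held. The frame labels and buffer depth are made up for illustration.

```python
from collections import deque

class CaptureBuffer:
    def __init__(self, n_frames):
        # A deque with maxlen silently discards the oldest entry when full,
        # i.e. it "jumps back to the beginning" like the photographers' trick.
        self._ring = deque(maxlen=n_frames)
        self.archive = []

    def record(self, frame):
        self._ring.append(frame)

    def store(self):
        # Permanently keep whatever is currently in the ring.
        self.archive.extend(self._ring)
        self._ring.clear()

buf = CaptureBuffer(n_frames=3)
for i in range(10):          # ten frames arrive; only the last three survive
    buf.record(f"frame-{i}")
buf.store()
print(buf.archive)           # -> ['frame-7', 'frame-8', 'frame-9']
```

The design choice is simply that storage cost is bounded by the buffer depth, not by how long the detector has been running.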

Just a thought

Patrick


On Wed, Oct 26, 2011 at 2:18 PM, John R Helliwell jrhelliw...@gmail.com wrote:

 Dear Frank,
 re 'who will write the grant?'.

 This is not as easy as it sounds, would that it were!

 There are two possible business plans:-
 Option 1. Specifically for MX is the PDB as the first and foremost
 candidate to seek such additional funds for full diffraction data
 deposition for each future PDB deposition entry. This business plan
 possibility is best answered by PDB/EBI (eg Gerard Kleywegt has
 answered this in the negative thus far at the CCP4 January 2010).

 Option 2 The Journals that host the publications could add the cost to
 the subscriber and/or the author according to their funding model. As
 an example and as a start a draft business plan has been written by
 one of us [JRH] for IUCr Acta Cryst E; this seemed attractive because
 of its simpler 'author pays' financing. This proposed business plan is
 now with IUCr Journals to digest and hopefully refine. Initial
 indications are that Acta Cryst C would be perceived by IUCr Journals
 as a better place to start considering this in detail, as it involves
 fewer crystal structures than Acta E and would thus be more
 manageable. The overall advantage of the responsibility being with
 Journals as we see it is that it encourages such 'archiving of data
 with literature' across all crystallography related techniques (single
 crystal, SAXS, SANS, Electron crystallography etc) and fields
 (Biology, Chemistry, Materials, Condensed Matter Physics etc) ie not
 just one technique and field, although obviously biology is dear to
 our hearts here in the CCP4bb.

 Yours sincerely,
 John and Tom
 John Helliwell  and Tom Terwilliger

 On Wed, Oct 26, 2011 at 9:21 AM, Frank von Delft
 frank.vonde...@sgc.ox.ac.uk wrote:
  Since when has the cost of any project been limited by the cost of
  hardware?  Someone has to implement this -- and make a career out of it;
  thunderingly absent from this thread has been the chorus of volunteers
 who
  will write the grant.
  phx
 
 
  On 25/10/2011 21:10, Herbert J. Bernstein wrote:
 
  To be fair to those concerned about cost, a more conservative estimate
  from the NSF RDLM workshop last summer in Princeton is $1,000 to $3,000
  per terabyte per year for long term storage allowing for overhead in
  moderate-sized institutions such as the PDB.  Larger entities, such
  as Google are able to do it for much lower annual costs in the range of
  $100 to $300 per terabyte per year.  Indeed, if this becomes a serious
  effort, one might wish to consider involving the large storage farm
  businesses such as Google and Amazon.  They might be willing to help
  support science partially in exchange for eyeballs going to their sites.
 
  Regards,
 H. J. Bernstein
 
  At 1:56 PM -0600 10/25/11, James Stroud wrote:
 
  On Oct 24, 2011, at 3:56 PM, James Holton wrote:
 
  The PDB only gets about 8000 depositions per year
 
  Just to put this into dollars. If each dataset is about 17 GB in
  size, then that's about 136 TB of storage that needs to come online
  every year to store the raw data for every structure. A two second
  search reveals that Newegg has a 3 TB Hitachi for $200. So that's
  about $9,000 / year of storage for the raw data behind PDB deposits.
 
  James
 
 



 --
 Professor John R Helliwell DSc




-- 
 patr...@douglas.co.uk    Douglas Instruments Ltd.
 Douglas House, East Garston, Hungerford, Berkshire, RG17 7HD, UK
 Directors: Peter Baldock, Patrick Shaw Stewart

 http://www.douglas.co.uk
 Tel: +44 (0) 148-864-9090    US toll-free 1-877-225-2034
 Regd. England 2177994, VAT Reg. GB 480 7371 36
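Re-running the storage arithmetic quoted above, with the thread's own inputs (8000 depositions/year, ~17 GB per dataset, ~$200 per 3 TB consumer drive in 2011), as an order-of-magnitude check:

```python
# All inputs are the figures mentioned in the thread; treat the result
# as a back-of-the-envelope estimate only.
depositions_per_year = 8000
gb_per_dataset = 17
tb_per_year = depositions_per_year * gb_per_dataset / 1000   # TB of new data per year
drives_needed = tb_per_year / 3                              # at 3 TB per drive
cost_per_year = drives_needed * 200                          # at $200 per drive

print(f"{tb_per_year:.0f} TB/year, about ${cost_per_year:,.0f} in raw disk")
```

That comes out at roughly 136 TB and on the order of $10k per year in raw disk, before any redundancy, curation, or hosting overhead.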


Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Gloria Borgstahl
I just want to jump in to state that I am ALL FOR the notion of
depositing the images that go with the structure factors and the
refined structure.

Through the years, I have been interviewing folks about the strange
satellite diffraction they saw but ignored: they used the main reflections
they could integrate and deposited that structure. That does not help me
justify the existence of modulated protein crystals to reviewers.

But if I could go and retrieve those images and reanalyze them with new
methods?  Dream come true.  Reviewers convinced.




Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Jacob Keller
Is anyone seriously questioning whether we should archive the images
used for published structures? That amount of space is trivial, could
be implemented just as another link in the PDB website, and would be
really helpful in some cases. One person could set it up in a day! You
could just make it a policy: no images, no PDB submission, no
publishing!

Jacob



[ccp4bb] Experimental Postdoctoral Position in High Throughput Small Molecule Ligand Screening

2011-10-26 Thread Forness, Jessica D
Experimental Postdoctoral Position in High Throughput Small Molecule Ligand 
Screening


Outstanding postdoctoral applicants are sought to work jointly with Drs. Julia 
Kubanek, Mark Hay and Jeffrey Skolnick at the Georgia Institute of Technology, 
with the following qualifications:

* Extensive experience in enzyme kinetics studies, enzyme purification or other 
aspects of protein biology and enzyme activity. Experience in handling multiple 
protein systems would be a plus.
* A background in high throughput small molecule ligand screening is strongly 
preferred.
* Experience with or a desire to learn computational biology and molecular 
modeling of protein-ligand interactions.
* The ideal candidate is someone who gets satisfaction out of methods 
development and working through large data sets to see broad-scale patterns.


To apply, please email your CV to : skoln...@gatech.edu  


[ccp4bb] cryo protection

2011-10-26 Thread Leonard Thomas

Hi All,

I have run into a very sensitive crystal system when it comes to cryo  
protecting it.  I have run through the usual suspects, and trays are  
going to be set up with a cryoprotectant as part of the crystallization  
cocktail.  The one problem that seems to be occurring is that the  
crystals crack as soon as they are transferred out of the original  
drop.  I am running out of ideas and really would love some new ones.


Thanks in advance.

Len

Leonard Thomas Ph.D.
Macromolecular Crystallography Laboratory Manager
University of Oklahoma
Department of Chemistry and Biochemistry
Stephenson Life Sciences Research Center
101 Stephenson Parkway
Norman, OK 73019-5251

lmtho...@ou.edu
http://barlywine.chem.ou.edu
Office: (405)325-1126
Lab: (405)325-7571


Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Frank von Delft

Cool - we've found our volunteer!!

On 26/10/2011 17:28, Jacob Keller wrote:

Is anyone seriously questioning whether we should archive the images
used for published structures? That amount of space is trivial, could
be implemented just as another link in the PDB website, and would be
really helpful in some cases. One person could set it up in a day! You
could just make it a policy: no images, no PDB submission, no
publishing!

Jacob



Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Jacob Keller
Touché! But alas, I have no access to the PDB's server, so...

JPK

On Wed, Oct 26, 2011 at 11:54 AM, Frank von Delft
frank.vonde...@sgc.ox.ac.uk wrote:
 Cool - we've found our volunteer!!


Re: [ccp4bb] cryo protection

2011-10-26 Thread Mark J van Raaij
You may have thought of this already, but you could try cryoprotection in the 
drop itself, i.e. slowly adding cryoprotectant to the reservoir, or replacing 
the reservoir bit by bit with solution containing cryoprotectant, and then 
adding small volumes to the side of the drop.
For example: exchange 20% of the reservoir volume for cryo-solution, let it 
equilibrate with the unchanged drop for a few hours, then add 20% of the drop's 
volume to the drop from the reservoir (i.e. 0.2 ul if the drop is 1 ul), then 
exchange another 20% of the reservoir for cryo-solution, equilibrate a few hrs, 
etc. - the idea being to change the drop conditions very slowly and minimise 
the risk of cracking.
Of course, you may need patience and many drops of crystals, not just many 
crystals in a few drops, until you find the cryoprotectant where the crystals 
do not crack and still diffract, if you are successful at all...
But if it works, you can harvest straight from the equilibrated drop and 
flash-cool directly.
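To see how quickly the reservoir (and hence the drop) approaches the final cryo concentration under such stepwise 20% exchanges, here is a toy calculation. The 25% stock concentration is an assumed example, and the model ignores vapour-phase equilibration with the drop entirely.

```python
# Illustrative numbers only: each exchange replaces 20% of the reservoir
# with cryo stock, so the concentration follows c_new = 0.8*c + 0.2*stock.
stock = 25.0      # % cryoprotectant in the exchange solution (assumed)
reservoir = 0.0   # starting % in the reservoir
for step in range(1, 9):
    reservoir = 0.8 * reservoir + 0.2 * stock
    print(f"after exchange {step}: {reservoir:.1f}%")
```

After eight rounds the reservoir is within about 5 percentage points of the stock concentration, which is why the procedure needs many exchanges (and patience) rather than a few.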

Mark J van Raaij
Laboratorio M-4
Dpto de Estructura de Macromoleculas
Centro Nacional de Biotecnologia - CSIC
c/Darwin 3
E-28049 Madrid, Spain
tel. (+34) 91 585 4616
http://www.cnb.csic.es/content/research/macromolecular/mvraaij





On 26 Oct 2011, at 18:46, Leonard Thomas wrote:

 Hi All,
 
 I have run into a very sensitive crystal system when it comes to cryo 
 protecting them.  I have run through the usual suspects and trays are going 
 to be set up with a cryoprotectant as part of the crystallization cocktail.  The 
 one problem that seems to be occurring is that the crystals crack as soon as 
 they are transferred out of the original drop.  I am running out of ideas and 
 really would love some new ones.
 
 Thanks in advance.
 
 Len
 
 Leonard Thomas Ph.D.
 Macromolecular Crystallography Laboratory Manager
 University of Oklahoma
 Department of Chemistry and Biochemistry
 Stephenson Life Sciences Research Center
 101 Stephenson Parkway
 Norman, OK 73019-5251
 
 lmtho...@ou.edu
 http://barlywine.chem.ou.edu
 Office: (405)325-1126
 Lab: (405)325-7571


Re: [ccp4bb] cryo protection

2011-10-26 Thread Tim Gruene

Dear Len,

just to be on the safe side, my list of 'usual suspects' includes
- glycerol/PEG400
- LiCl et al. at high concentration
- butanediol
- sugars (glucose/fructose)
- oil
- NaMalonate
- MPD
...

You mention cracking upon transferring the crystal.
- do you use a pipet for transfer?
- addition of cryo TO the drop?
- did you try slow (several minutes - 1 hr) / quick addition of cryoprotectant?
- seeding into slightly different conditions/additive screens
- seeding into cryo conditions
...

How about collecting data at room temperature?

Hope this list contains some new ideas.

Best wishes,
Tim

On 10/26/2011 06:46 PM, Leonard Thomas wrote:
 Hi All,
 
 I have run into a very sensitive crystal system when it comes to cryo
 protecting them.  I have run through the usual suspects and trays are
 going to be set up with a cryoprotectant as part of the crystallization
 cocktail.  The one problem that seems to be occurring is that the
 crystals crack as soon as they are transferred out of the original drop.
 I am running out of ideas and really would love some new ones.
 
 Thanks in advance.
 
 Len
 
 Leonard Thomas Ph.D.
 Macromolecular Crystallography Laboratory Manager
 University of Oklahoma
 Department of Chemistry and Biochemistry
 Stephenson Life Sciences Research Center
 101 Stephenson Parkway
 Norman, OK 73019-5251
 
 lmtho...@ou.edu
 http://barlywine.chem.ou.edu
 Office: (405)325-1126
 Lab: (405)325-7571
 

--
Dr Tim Gruene
Institut fuer anorganische Chemie
Tammannstr. 4
D-37077 Goettingen

GPG Key ID = A46BEE1A



Re: [ccp4bb] cryo protection

2011-10-26 Thread Roger Rowlett

Len,

We have run into this problem from time to time, and it is very
frustrating. Here are some things to try, some of which you may have
done already:


- Grow crystals in a small percentage of the cryoprotectant
  (e.g., 5-10% glycerol). This often allows crystal transfer into
  a cryo drop without cracking. (Almost never works for us,
  though.)
- Do your crystal transfers in the cold room. This slows
  evaporation markedly, and may prevent crystal cracking. (This
  works for us some of the time.)
- Transfer your crystals to gradually higher cryoprotectant
  concentrations (e.g., to 15% glycerol, then 30% glycerol).
  (Fiddly, and the crystals get handled a lot, but often works.)
- Use different cryoprotectants. We almost always have fewer
  cracking issues with glucose than glycerol, but YMMV.
- Avoid transferring the crystal from the drop at all. Just add
  cryoprotectant to the drop. Even better, add cryoprotectant to
  the drop gradually, while keeping the drop humidified over well
  solution. This is our "no-fail" method (usually, but not always,
  successful):
  http://capsicum.colgate.edu/chwiki/tiki-index.php?page=Mounting+Protein+Crystals#No_fail_cryoprotection
  We typically use glucose in this method, but in principle you
  could try glycerol, MPD, PEG-400, or sodium formate, etc.

Otherwise, you can try to grow out of a cryo condition that doesn't
need extra cryoprotectant (been there done that) or give up and
shoot at room temp in-house.

Cheers,

___
Roger S. Rowlett
Gordon & Dorothy Kline Professor
Department of Chemistry
Colgate University
13 Oak Drive
Hamilton, NY 13346

tel: (315)-228-7245
ofc: (315)-228-7395
fax: (315)-228-7935
email: rrowl...@colgate.edu

On 10/26/2011 12:46 PM, Leonard Thomas wrote:
 Hi All,

 I have run into a very sensitive crystal system when it comes to
 cryo protecting them. I have run through the usual suspects and
 trays are going to be set up with a cryoprotectant as part of the
 crystallization cocktail. The one problem that seems to be
 occurring is that the crystals crack as soon as they are
 transferred out of the original drop. I am running out of ideas
 and really would love some new ones.

 Thanks in advance.

 Len

 Leonard Thomas Ph.D.
 Macromolecular Crystallography Laboratory Manager
 University of Oklahoma
 Department of Chemistry and Biochemistry
 Stephenson Life Sciences Research Center
 101 Stephenson Parkway
 Norman, OK 73019-5251

 lmtho...@ou.edu
 http://barlywine.chem.ou.edu
 Office: (405)325-1126
 Lab: (405)325-7571



Re: [ccp4bb] Question about drawing disulfide bonds in Pymol

2011-10-26 Thread Christian Roth
Hi,

I think it should also work with the cartoon_side_chain_helper option, which 
adjusts the cartoon slightly to prevent such situations.

Christian
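Christian's suggestion, and the "wavy strands" idea from the question, can be written out as PyMOL commands. This is an untested sketch; the selection and object names are hypothetical examples, not from the original post:

```
# Untested sketch in PyMOL command syntax; names below are hypothetical.
show cartoon
show sticks, resn CYS and not (name N+C+O)
set cartoon_side_chain_helper, on   # trim shown side chains to meet the cartoon

# Alternative: put only the strands of interest in their own object and
# disable flat sheets there, keeping the rest of the cartoon flat.
create wavy_strands, chain A and resi 10-20+30-40
set cartoon_flat_sheets, 0, wavy_strands
```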

Am Mittwoch 26 Oktober 2011 16:15:55 schrieb Ke, Jiyuan:
 Dear All,
 
 I have a question regarding making disulfide bonds in PyMOL. The overall
  structure is in cartoon representation. When I display the disulfide-bond
  residues as stick models, the cysteine residues sitting on a beta strand
  have a gap from the actual beta strand. If I check off the flat-sheet
  option, all the beta strands look very wavy, but the cysteines are
  actually on the beta strands. I want to change just the two beta strands
  with cysteine residues on them to wavy strands and keep the other
  beta strands flat-looking. Does anyone know a way to do this, or a better
  way to draw individual residues on beta strands? Thanks! Jiyuan
 
 
 Jiyuan Ke, Ph.D.
 Research Scientist
 Van Andel Research Institute
 333 Bostwick Ave NE
 Grand Rapids, MI 49503
 
 


[ccp4bb] cryo protection

2011-10-26 Thread Elspeth Garman
Dear Len
This is a classic sign of osmotic shock. You can try matching the osmotic 
pressure of the mother liquor and the cryoprotectant buffer.
For a protocol see Acta Cryst D (1999) 55, 1649 section 6.4 
Good luck 
Best wishes 
Elspeth

-Original Message-
From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of Leonard 
Thomas
Sent: 26 October 2011 17:46
To: CCP4BB@JISCMAIL.AC.UK
Subject: [ccp4bb] cryo protection

Hi All,

I have run into a very sensitive crystal system when it comes to cryo 
protecting them.  I have run through the usual suspects and trays are going to 
be set up with a cryoprotectant as part of the crystallization cocktail.  The one 
problem that seems to be occurring is that the crystals crack as soon as they 
are transferred out of the original drop.  I am running out of ideas and really 
would love some new ones.

Thanks in advance.

Len

Leonard Thomas Ph.D.
Macromolecular Crystallography Laboratory Manager
University of Oklahoma
Department of Chemistry and Biochemistry
Stephenson Life Sciences Research Center
101 Stephenson Parkway
Norman, OK 73019-5251

lmtho...@ou.edu
http://barlywine.chem.ou.edu
Office: (405)325-1126
Lab: (405)325-7571


Re: [ccp4bb] cryo protection

2011-10-26 Thread Filip Van Petegem
Hello Leonard,

one thing to test is whether transferring your crystals to a drop containing
simply well solution also causes cracking. If yes, then the possibility
exists that the absence of protein in solution is causing the trouble. In
that case, you can transfer the crystals to oil:  you'll be transferring the
solution (with protein) in which the crystal grew as well, and slowly remove
it without adding anything 'different'.  However, if your crystals crack
simply because they are mechanically fragile, then the oil may actually be
worse.

Filip

On Wed, Oct 26, 2011 at 9:46 AM, Leonard Thomas lmtho...@ou.edu wrote:

 Hi All,

 I have run into a very sensitive crystal system when it comes to cryo
 protecting them.  I have run through the usual suspects and trays are going
 to be set up with a cryoprotectant as part of the crystallization cocktail.  The
 one problem that seems to be occurring is that the crystals crack as soon
 as they are transferred out of the original drop.  I am running out of ideas
 and really would love some new ones.

 Thanks in advance.

 Len

 Leonard Thomas Ph.D.
 Macromolecular Crystallography Laboratory Manager
 University of Oklahoma
 Department of Chemistry and Biochemistry
 Stephenson Life Sciences Research Center
 101 Stephenson Parkway
 Norman, OK 73019-5251

 lmtho...@ou.edu
 http://barlywine.chem.ou.edu
 Office: (405)325-1126
 Lab: (405)325-7571




-- 
Filip Van Petegem, PhD
Assistant Professor
The University of British Columbia
Dept. of Biochemistry and Molecular Biology
2350 Health Sciences Mall - Rm 2.356
Vancouver, V6T 1Z3

phone: +1 604 827 4267
email: filip.vanpete...@gmail.com
http://crg.ubc.ca/VanPetegem/


Re: [ccp4bb] cryo protection

2011-10-26 Thread Zhou, Tongqing (NIH/VRC) [E]
You can also try to crosslink before transferring to cryo.


From: Filip Van Petegem filip.vanpete...@gmail.com
To: CCP4BB@JISCMAIL.AC.UK CCP4BB@JISCMAIL.AC.UK
Sent: Wed Oct 26 13:19:16 2011
Subject: Re: [ccp4bb] cryo protection

Hello Leonard,

one thing to test is whether transferring your crystals to a drop containing 
simply well solution also causes cracking. If yes, then the possibility exists 
that the absence of protein in solution is causing the trouble. In that case, 
you can transfer the crystals to oil:  you'll be transferring the solution 
(with protein) in which the crystal grew as well, and slowly remove it without 
adding anything 'different'.  However, if your crystals crack simply because 
they are mechanically fragile, then the oil may actually be worse.

Filip

On Wed, Oct 26, 2011 at 9:46 AM, Leonard Thomas 
lmtho...@ou.edu wrote:
Hi All,

I have run into a very sensitive crystal system when it comes to cryo 
protecting them.  I have run through the usual suspects and trays are going to 
be set up with a cryoprotectant as part of the crystallization cocktail.  The one 
problem that seems to be occurring is that the crystals crack as soon as they 
are transferred out of the original drop.  I am running out of ideas and really 
would love some new ones.

Thanks in advance.

Len

Leonard Thomas Ph.D.
Macromolecular Crystallography Laboratory Manager
University of Oklahoma
Department of Chemistry and Biochemistry
Stephenson Life Sciences Research Center
101 Stephenson Parkway
Norman, OK 73019-5251

lmtho...@ou.edu
http://barlywine.chem.ou.edu
Office: (405)325-1126
Lab: (405)325-7571



--
Filip Van Petegem, PhD
Assistant Professor
The University of British Columbia
Dept. of Biochemistry and Molecular Biology
2350 Health Sciences Mall - Rm 2.356
Vancouver, V6T 1Z3

phone: +1 604 827 4267
email: filip.vanpete...@gmail.com
http://crg.ubc.ca/VanPetegem/


Re: [ccp4bb] cryo protection

2011-10-26 Thread Muhammed bashir Khan
Hi Len;

I was having exactly the same problem with my crystals, but we grew
the crystals in the presence of increasing concentrations of glycerol and MPD,
from 0.5 to 10%. The crystals did not appear above 3% glycerol
or MPD, but the ones that appeared at 2.5 to 3% were much more resistant to
cracking than the original crystals.

 Good luck

Bashir

On Wed, October 26, 2011 18:46, Leonard Thomas wrote:
 Hi All,

 I have run into a very sensitive crystal system when it comes to cryo
 protecting them.  I have run through the usual suspects and trays are
 going to be set up with a cryoprotectant as part of the crystallization
 cocktail.  The one problem that seems to be occurring is that the
 crystals crack as soon as they are transferred out of the original
 drop.  I am running out of ideas and really would love some new ones.

 Thanks in advance.

 Len

 Leonard Thomas Ph.D.
 Macromolecular Crystallography Laboratory Manager
 University of Oklahoma
 Department of Chemistry and Biochemistry
 Stephenson Life Sciences Research Center
 101 Stephenson Parkway
 Norman, OK 73019-5251

 lmtho...@ou.edu
 http://barlywine.chem.ou.edu
 Office: (405)325-1126
 Lab: (405)325-7571



-- 
Muhammad Bashir Khan
**
Department for Structural and Computational Biology
Max F. Perutz Laboratories
University of Vienna
Campus Vienna Biocenter 5
A-1030 Vienna
Austria

Phone: +43(1)427752224
Fax: +43(1)42779522


Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Herbert J. Bernstein

Dear Colleagues,

  Gerard strikes a very useful note in pleading for a can-do
approach.  Part of going from can-do to actually-done
is to make realistic estimates of the costs of doing and
then to adjust plans appropriately to do what can be afforded
now and to work towards doing as much of what remains undone
as has sufficient benefit to justify the costs.

  We appear to be in a fortunate situation in which some
portion of the raw data behind a significant portion of the
studies released in the PDB could probably be retained for some
significant period of time and be made available for further
analysis.  It would seem wise to explore these possibilities
and try to optimize the approaches used -- e.g. to consider
moves towards well documented formats, and retention of critical
metadata with such data to help in future analysis.

  Please do not let the perfect be the enemy of the good.

  Regards,
Herbert

=
 Herbert J. Bernstein, Professor of Computer Science
   Dowling College, Kramer Science Center, KSC 121
Idle Hour Blvd, Oakdale, NY, 11769

 +1-631-244-3035
 y...@dowling.edu
=

On Wed, 26 Oct 2011, Gerard Bricogne wrote:


Dear John and colleagues,

There seem to be a set of centrifugal forces at play within this thread
that are distracting us from a sensible path of concrete action by throwing
decoys in every conceivable direction, e.g.

* Pilatus detectors spew out such a volume of data that we can't
possibly archive it all - does that mean that because the 5th generation of
Dectris detectors will be able to write one billion images a second and
catch every scattered photon individually, we should not try and archive
more information than is given by the current merged structure factor data?
That seems a complete failure of reasoning to me: there must be a sensible
form of raw data archiving that would stand between those two extremes and
would retain much more information than the current merged data but would
step back from the enormous degree of oversampling of the raw diffraction
pattern that the Pilatus and its successors are capable of.

* It is all going to cost an awful lot of money, therefore we need a
team of grant writers to raise its hand and volunteer to apply for resources
from one or more funding agencies - there again there is an avoidance of
the feasible by invocation of the impossible. The IUCr Forum already has an
outline of a feasibility study that would cost only a small amount of
joined-up thinking and book-keeping around already stored information, so
let us not use the inaccessibility of federal or EC funding as a scarecrow
to justify not even trying what is proposed there. And the idea that someone
needs to decide to stake his/her career on this undertaking seems totally
overblown.

Several people have already pointed out that the sets of images that
would need to be archived would be a very small subset of the bulk of
datasets that are being held on the storage systems of synchrotron sources.
What needs to be done, as already described, is to be able to refer to those
few datasets that gave rise to the integrated data against which deposited
structures were refined (or, in some cases, solved by experimental phasing),
to give them special status in terms of making them visible and accessible
on-line at the same time as the pdb entry itself (rather than after the
statutory 2-5 years that would apply to all the rest, probably in a more
off-line form), and to maintain that accessibility for ever, with a link
from the pdb entry and perhaps from the associated publication. It seems
unlikely that this would involve the mobilisation of such large resources as
to require either a human sacrifice (of the poor person whose life would be
staked on this gamble) or writing a grant application, with the indefinite
postponement of action and the loss of motivation this would imply.

Coming back to the more technical issue of bloated datasets, it is a
scientific problem that must be amenable to rational analysis to decide on a
sensible form of compression of overly-verbose sets of thin-sliced, perhaps
low-exposure images that would already retain a large fraction, if not all,
of the extra information on which we would wish future improved versions of
processing programs to cut their teeth, for a long time to come. This
approach would seem preferable to stoking up irrational fears of not being
able to cope with the most exaggerated predictions of the volumes of data to
archive, and thus doing nothing at all.
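The compressibility argument above can be illustrated with a toy experiment. The "image" model here (low, exponentially distributed counts standing in for a low-exposure, mostly-background diffraction frame) is an assumption for illustration only, not a claim about real detector output:

```python
# Illustrative only: losslessly compress a synthetic low-count "image"
# (mostly small pixel values), as a stand-in for thin-sliced, low-exposure
# diffraction frames, and compare raw vs. compressed size.
import gzip
import random

random.seed(0)
width, height = 512, 512
# Low-exposure frames are dominated by small counts -> highly compressible.
pixels = bytes(min(255, int(random.expovariate(1 / 3)))
               for _ in range(width * height))
compressed = gzip.compress(pixels)
print(f"raw: {len(pixels)} bytes, gzip: {len(compressed)} bytes, "
      f"ratio: {len(pixels) / len(compressed):.1f}x")
```

Lossless general-purpose compression already shrinks such frames substantially; format-aware schemes could do better without discarding any counts.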

I very much hope that the "can do" spirit that marked the final
discussions of the DDDWG (Diffraction Data Deposition Working Group) in
Madrid will emerge on top of all the counter-arguments that consist in
moving the goal posts to prove that the initial goal is unreachable.


With best wishes,

 Gerard.


Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread James Stroud

On Oct 26, 2011, at 9:59 AM, Patrick Shaw Stewart wrote:

 The principle is that the movie is written to the same area of memory, 
 jumping back to the beginning when it is full (this part is not essential, 
 but it makes the principle clear).  Then, when the photographer takes his 
 finger off the trigger, the last x seconds is permanently stored.  So you 
 keep your wits about you, and press the metaphorical store button just 
 after you have got the movie in the can so to speak

This idea seems equivalent to only storing permanently those datasets that 
actually yield structures worthy of deposition.

James



Re: [ccp4bb] cryo protection

2011-10-26 Thread Andrew Purkiss-Trew
Another possibility (other than those already mentioned) is to try  
freezing without a cryoprotectant, by fishing the crystals out onto a  
mesh and removing all the mother liquor.


The following paper has some details:
Direct cryocooling of naked crystals: are cryoprotection agents  
always necessary?


Erika Pellegrini, Dario Piano and Matthew W. Bowler

Acta Cryst. (2011). D67, 902–906

--
Andrew Purkiss
X-ray Laboratory Manager
Cancer Research UK
London Research Institute.


Quoting Leonard Thomas lmtho...@ou.edu:


Hi All,

I have run into a very sensitive crystal system when it comes to
cryo protecting them.  I have run through the usual suspects and
trays are going to be set up with a cryoprotectant as part of the
crystallization cocktail.  The one problem that seems to be
occurring is that the crystals crack as soon as they are transferred
out of the original drop.  I am running out of ideas and really
would love some new ones.


Thanks in advance.

Len

Leonard Thomas Ph.D.
Macromolecular Crystallography Laboratory Manager
University of Oklahoma
Department of Chemistry and Biochemistry
Stephenson Life Sciences Research Center
101 Stephenson Parkway
Norman, OK 73019-5251

lmtho...@ou.edu
http://barlywine.chem.ou.edu
Office: (405)325-1126
Lab: (405)325-7571






This message was sent using IMP, the Internet Messaging Program.


Re: [ccp4bb] cryo protection

2011-10-26 Thread Leonard Thomas
A good number of things to try.  Just a little more info that was
asked for.  The crystals are grown in PEG 3350 over a range of pH
values using Bis-Tris propane.  They are coming out of two different salt
conditions.  My feeling is that it is an osmolality problem, though I also
observed cracking when going into a separately made well solution.  I
will look at trying a number of the suggestions given.


Cheers,
Len

Leonard Thomas Ph.D.
Macromolecular Crystallography Laboratory Manager
University of Oklahoma
Department of Chemistry and Biochemistry
Stephenson Life Sciences Research Center
101 Stephenson Parkway
Norman, OK 73019-5251

lmtho...@ou.edu
http://barlywine.chem.ou.edu
Office: (405)325-1126
Lab: (405)325-7571

On Oct 26, 2011, at 12:54 PM, Muhammed bashir Khan wrote:


Hi Len;

I was having exactly the same problem with my crystals, but we grew
the crystals in the presence of increasing concentrations of glycerol and MPD,
from 0.5 to 10%. The crystals did not appear above 3% glycerol
or MPD, but the ones that appeared at 2.5 to 3% were much more resistant to
cracking than the original crystals.

Bashir

On Wed, October 26, 2011 18:46, Leonard Thomas wrote:

Hi All,

I have run into a very sensitive crystal system when it comes to cryo
protecting them.  I have run through the usual suspects and trays are
going to be set up with a cryoprotectant as part of the crystallization
cocktail.  The one problem that seems to be occurring is that the
crystals crack as soon as they are transferred out of the original
drop.  I am running out of ideas and really would love some new ones.

Thanks in advance.

Len

Leonard Thomas Ph.D.
Macromolecular Crystallography Laboratory Manager
University of Oklahoma
Department of Chemistry and Biochemistry
Stephenson Life Sciences Research Center
101 Stephenson Parkway
Norman, OK 73019-5251

lmtho...@ou.edu
http://barlywine.chem.ou.edu
Office: (405)325-1126
Lab: (405)325-7571




--
Muhammad Bashir Khan
**
Department for Structural and Computational Biology
Max F. Perutz Laboratories
University of Vienna
Campus Vienna Biocenter 5
A-1030 Vienna
Austria

Phone: +43(1)427752224
Fax: +43(1)42779522




Re: [ccp4bb] cryo protection

2011-10-26 Thread harkewal singh
Len,
Maybe you have already done this: I would closely check the
crystallization conditions and also check the pH of the cryo. In some
cases, during cryoprotection the pH of the original drop may be drastically
different from that of the cryo solution. Also, some time back, we were
exploring different cryoprotectant conditions for a sensitive crystal and
came across this, by Artem:
http://www.xtals.org/crystal_cryo.pdf

HTH
Harkewal


On Wed, 26 Oct 2011 11:46:08 -0500, Leonard Thomas lmtho...@ou.edu
wrote:
 Hi All,
 
 I have run into a very sensitive crystal system when it comes to
 cryo protecting them.  I have run through the usual suspects and
 trays are going to be set up with a cryoprotectant as part of the
 crystallization cocktail.  The one problem that seems to be
 occurring is that the crystals crack as soon as they are transferred
 out of the original drop.  I am running out of ideas and really would
 love some new ones.
 
 Thanks in advance.
 
 Len
 
 Leonard Thomas Ph.D.
 Macromolecular Crystallography Laboratory Manager
 University of Oklahoma
 Department of Chemistry and Biochemistry
 Stephenson Life Sciences Research Center
 101 Stephenson Parkway
 Norman, OK 73019-5251
 
 lmtho...@ou.edu
 http://barlywine.chem.ou.edu
 Office: (405)325-1126
 Lab: (405)325-7571


Re: [ccp4bb] cryo protection

2011-10-26 Thread Mathews, Irimpan I.
One small point:

Just make sure that you are not too far off from the contents of the protein 
solution.  Sometimes the protein solution may have a high amount of salt or 
similar, and we forget to include at least half of this concentration in the 
cryo solution.  This could easily crack the crystals, depending on the 
concentration and the type of compounds in it.

Regards,
Mathews


-Original Message-
From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of Leonard 
Thomas
Sent: Wednesday, October 26, 2011 11:57 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: Re: [ccp4bb] cryo protection

A good number of things to try.  Just a little more info that was
asked for.  The crystals are grown in PEG 3350 over a range of pH
values using Bis-Tris propane.  They are coming out of two different salt
conditions.  My feeling is that it is an osmolality problem, though I also
observed cracking when going into a separately made well solution.  I
will look at trying a number of the suggestions given.

Cheers,
Len

Leonard Thomas Ph.D.
Macromolecular Crystallography Laboratory Manager
University of Oklahoma
Department of Chemistry and Biochemistry
Stephenson Life Sciences Research Center
101 Stephenson Parkway
Norman, OK 73019-5251

lmtho...@ou.edu
http://barlywine.chem.ou.edu
Office: (405)325-1126
Lab: (405)325-7571

On Oct 26, 2011, at 12:54 PM, Muhammed bashir Khan wrote:

 Hi Len;

 I was having exactly the same problem with my crystals, but we grew
 the crystals in the presence of increasing concentrations of glycerol and MPD,
 from 0.5 to 10%. The crystals did not appear above 3% glycerol
 or MPD, but the ones that appeared at 2.5 to 3% were much more resistant to
 cracking than the original crystals.

 Good luck

 Bashir

 On Wed, October 26, 2011 18:46, Leonard Thomas wrote:
 Hi All,

 I have run into a very sensitive crystal system when it comes to cryo
 protecting them.  I have run through the usual suspects and trays are
 going to be set up with a cryoprotectant as part of the crystallization
 cocktail.  The one problem that seems to be occurring is that the
 crystals crack as soon as they are transferred out of the original
 drop.  I am running out of ideas and really would love some new ones.

 Thanks in advance.

 Len

 Leonard Thomas Ph.D.
 Macromolecular Crystallography Laboratory Manager
 University of Oklahoma
 Department of Chemistry and Biochemistry
 Stephenson Life Sciences Research Center
 101 Stephenson Parkway
 Norman, OK 73019-5251

 lmtho...@ou.edu
 http://barlywine.chem.ou.edu
 Office: (405)325-1126
 Lab: (405)325-7571



 -- 
 Muhammad Bashir Khan
 **
 Department for Structural and Computational Biology
 Max F. Perutz Laboratories
 University of Vienna
 Campus Vienna Biocenter 5
 A-1030 Vienna
 Austria

 Phone: +43(1)427752224
 Fax: +43(1)42779522




Re: [ccp4bb] cryo protection

2011-10-26 Thread David Schuller
One more thing you could try: high-pressure cryo-cooling. See any of a 
number of papers by Chae Un Kim, e.g.


http://www.ncbi.nlm.nih.gov/pubmed/17452791

Acta Crystallogr D Biol Crystallogr. 2007 May;63(Pt 5):653-9. 
Epub 2007 Apr 21.





On 10/26/11 12:46, Leonard Thomas wrote:

Hi All,

I have run into a very sensitive crystal system when it comes to cryo 
protecting them.  I have run through the usual suspects and trays are 
going to be set up with a cryoprotectant as part of the crystallization 
cocktail.  The one problem that seems to be occurring is that the 
crystals crack as soon as they are transferred out of the original 
drop.  I am running out of ideas and really would love some new ones.


Thanks in advance.

Len

Leonard Thomas Ph.D.
Macromolecular Crystallography Laboratory Manager
University of Oklahoma
Department of Chemistry and Biochemistry
Stephenson Life Sciences Research Center
101 Stephenson Parkway
Norman, OK 73019-5251

lmtho...@ou.edu
http://barlywine.chem.ou.edu
Office: (405)325-1126
Lab: (405)325-7571



--
===
All Things Serve the Beam
===
   David J. Schuller
   modern man in a post-modern world
   MacCHESS, Cornell University
   schul...@cornell.edu



Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Colin Nave
I have been nominated by the IUCr synchrotron commission (thanks colleagues!) 
to represent them for this issue. However, at the moment, this is a personal 
view.

1. For archiving raw diffraction image data for structures in the PDB, it 
should be the responsibility of the worldwide PDB. They are by far the best 
place to do it and as Jacob says the space requirements are trivial. Gerard K's 
negative statement at CCP4-2010 sounds rather ex cathedra (in increasing order 
of influence/power do we have the Pope, US president, the Bond Market and 
finally Gerard K?). Did he make the statement in a formal presentation or in 
the bar? More seriously, I am sure he had good reasons (e.g. PDB priorities) if 
he did make this statement. It would be nice if Gerard could provide some 
explanation.

2. I agree with the "can do" attitude at Madrid as supported by Gerard B. 
Setting up something as best one can with existing enthusiasts will get the 
ball rolling, provide some immediate benefit and allow subsequent improvements. 

3. Ideally the data to be deposited should include all stages e.g. raw images, 
"corrected" images, MIR/SAD/MAD images, unmerged integrated intensities, 
scaled, merged etc. Plus the metadata, software  versions used for the various 
stages. Worrying too much about all of this should not of course prevent a 
start being made. (An aside: I put "corrected" in quotes because the raw 
images have fewer errors. The subsequent processing for detector distortions 
etc. depends on an imperfect model of the detector. I don't like the phrase 
"data correction".)

4. Doing this for PDB depositions would then provide a basis for other data 
which did not result in PDB depositions. There seems to be a view that the 
archiving of this should be the responsibility of the synchrotrons which 
generated the data. This should be possible for some synchrotrons (e.g. 
Diamond) where there is pressure in any case from their funders to archive all 
data generated at the facility. However not all synchrotrons will be able to do 
this. There is also the issue of data collected at home sources. Presumably it 
will require a few willing synchrotrons to pioneer this in a coordinated way. 
Hopefully others will then follow. I don't think we can expect the PDB to 
archive the 99.96% of the data which did not result in structures.

5.  My view is that for data in the PDB the same release rules should apply for 
the images as for the other data. For other data, the funders of the research 
might want to define release rules. However, we can make suggestions!

6. Looking to the future, there is FEL data coming along, both single molecule 
and nano-crystals (assuming the FEL delivers for these areas).

7. I agree with Gerard B - as far as I see it, the highest future benefit of 
having archived raw images will result from being able to reprocess datasets 
from samples containing multiple lattices 
My view is that all crystals are, to a greater or lesser extent, subject to 
this. We just might not see it easily as the detector resolution or beam 
divergence is inadequate. Just think we could have several structures (one from 
each lattice) each with less disorder rather than just one average structure.  
Not sure whether Gloria's modulated structures would be as ubiquitous but her 
argument is along the same lines.

Regards 
  Colin

-Original Message-
From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of Herbert 
J. Bernstein
Sent: 26 October 2011 18:55
To: ccp4bb
Subject: Re: [ccp4bb] IUCr committees, depositing images

Dear Colleagues,

   Gerard strikes a very useful note in pleading for a can-do
approach.  Part of going from can-do to actually-done
is to make realistic estimates of the costs of doing and
then to adjust plans appropriately to do what can be afforded
now and to work towards doing as much of what remains undone
as has sufficient benefit to justify the costs.

   We appear to be in a fortunate situation in which some
portion of the raw data behind a significant portion of the
studies released in the PDB could probably be retained for some
significant period of time and be made available for further
analysis.  It would seem wise to explore these possibilities
and try to optimize the approaches used -- e.g. to consider
moves towards well documented formats, and retention of critical
metadata with such data to help in future analysis.

   Please do not let the perfect be the enemy of the good.

   Regards,
 Herbert

=
  Herbert J. Bernstein, Professor of Computer Science
Dowling College, Kramer Science Center, KSC 121
 Idle Hour Blvd, Oakdale, NY, 11769

  +1-631-244-3035
  y...@dowling.edu
=

On Wed, 26 Oct 2011, Gerard Bricogne wrote:

 Dear John and colleagues,

 There seems to be a set of centrifugal forces at play within this thread
 

Re: [ccp4bb] cryo protection

2011-10-26 Thread Jim Pflugrath
For some ideas on cryocrystallography, one can watch an online webinar on the 
subject:
http://www.rigaku.com/protein/webinar-001.html

Maybe some unbiased folks can comment? ;)

Jim



From: CCP4 bulletin board [CCP4BB@JISCMAIL.AC.UK] on behalf of Leonard Thomas 
[lmtho...@ou.edu]
Sent: Wednesday, October 26, 2011 11:46 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: [ccp4bb] cryo protection

Hi All,

I have run into a very sensitive crystal system when it comes to cryo
protecting them.  I have run through the usual suspects and trays are
...

[ccp4bb] Postdoctoral Position at King's College London

2011-10-26 Thread Steiner, Roberto
A three-year postdoctoral position in the field of mechanistic enzymology is 
immediately available in the Steiner Laboratory of King’s College London. The 
salary is £33,193 per annum inclusive of London allowance.
The project aims at studying the intriguing biological process of 
cofactor-independent oxygenation catalysis (1). To understand how dioxygen 
chemistry takes place in a cofactor-less manner we will use enzymes for which 
we have recently obtained structural information in various catalytically 
relevant states (2).
The ideal candidate has a strong biochemistry/chemistry background, extensive 
experience in structural biology focused on enzyme mechanisms and an interest 
in molecular dynamics (MD). The MD work will be carried out in collaboration 
with Prof. Ceccarelli of the University of Cagliari, Italy, where the postdoc 
will be able to spend some time to improve his/her skills in MD techniques. The 
postdoc will also be in close contact with the Manchester group of Prof. 
Scrutton where complementary spectroscopic and kinetic studies will be carried 
out.
This project will equip the post-holder with a rare and sought-after skill-set 
as well as a comprehensive overview of an inter-disciplinary study.
To apply for this position go to 
http://www.kcl.ac.uk/depsta/pertra/vacancy/external/pers_detail.php?jobindex=10871.
 Please make sure you quote the reference number G6/JKA/772/11-JT.
Informal enquiries are welcome at 
roberto.stei...@kcl.ac.ukmailto:roberto.stei...@kcl.ac.uk. The closing date 
for this application is 24 Nov. 2011.

(1) Fetzner S. and Steiner RA (2010). Cofactor-independent oxidases and 
oxygenases. Appl. Microbiol. Biotechnol. 86, 791-804.
(2) Steiner RA, Janssen HJ, Roversi P, Oakley AJ, Fetzner S (2010). Structural 
basis for cofactor-independent dioxygenation of N-heteroaromatic compounds at 
the α/β-hydrolase fold. Proc. Natl. Acad. Sci. USA, 107, 657-662.

Roberto Steiner, PhD
Group Leader
Randall Division of Cell and Molecular Biophysics
King's College London

Room 3.10A
New Hunt's House
Guy's Campus
SE1 1UL, London, UK
Tel 0044-20-78488216
Fax 0044-20-78486435
roberto.stei...@kcl.ac.ukmailto:roberto.stei...@kcl.ac.uk






Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Martin M. Ripoll
Dear George, dear all,

I was just trying to summarize my point of view regarding this important
issue when I got your e-mail, which reflects exactly my own opinion!

Martin

Dr. Martin Martinez-Ripoll
Research Professor
xmar...@iqfr.csic.es
Department of Crystallography & Structural Biology
www.xtal.iqfr.csic.es
Telf.: +34 917459550
Consejo Superior de Investigaciones Científicas
Spanish National Research Council
www.csic.es



-Original Message-
From: CCP4 bulletin board [mailto:CCP4BB@JISCMAIL.AC.UK] On Behalf Of George
M. Sheldrick
Sent: Wednesday, 26 October 2011 11:52
To: CCP4BB@JISCMAIL.AC.UK
Subject: Re: [ccp4bb] IUCr committees, depositing images

This raises an important point. The new continuous readout detectors such as 
the Pilatus for beamlines or the Bruker Photon for in-house use enable the 
crystal to be rotated at constant velocity, eliminating the mechanical errors 
associated with 'stop and go' data collection. Storing their data in 'frames' 
is an artificial construction that is currently required for the established 
data integration programs but is in fact throwing away information. Maybe in 
10 years time 'frames' will be as obsolete as punched cards!

George

On Wed, Oct 26, 2011 at 09:39:40AM +0100, Graeme Winter wrote:
 Hi James,
 
 Just to pick up on your point about the Pilatus detectors. Yesterday
 in 2 hours of giving a beamline a workout (admittedly with Thaumatin)
 we acquired 400 + GB of data*. Now I appreciate that this is not
 really routine operation, but it does raise an interesting point - if
 you have loaded a sample and centred it, collected test shots and
 decided it's not that great, why not collect anyway as it may later
 prove to be useful?
 
 Bzzt. 2 minutes or less later you have a full data set, and barely
 even time to go get a cup of tea.
 
 This does to some extent move the goalposts, as you can acquire far
 more data than you need. You never know, you may learn something
 interesting from it - perhaps it has different symmetry or packing?
 What it does mean is if we can have a method of tagging this data
 there may be massively more opportunity to get also-ran data sets for
 methods development types. What it also means however is that the cost
 of curating this data is then an order of magnitude higher.
 
 Also moving it around is also rather more painful.
 
 Anyhow, I would try to avoid dismissing the effect that new continuous
 readout detectors will have on data rates, from experience it is
 pretty substantial.
 
 Cheerio,
 
 Graeme
 
 *by "data" here what I mean is "images", rather than "information", which
 is rather more time consuming to acquire. I would argue you get that
 from processing / analysing the data...
 
 On 24 October 2011 22:56, James Holton jmhol...@lbl.gov wrote:
  The Pilatus is fast, but for decades now we have had detectors that can read
  out in ~1s.  This means that you can collect a typical ~100 image dataset in
  a few minutes (if flux is not limiting).  Since there are ~150 beamlines
  currently operating around the world and they are open about 200 days/year,
  we should be collecting ~20,000,000 datasets each year.
 
  We're not.
 
  The PDB only gets about 8000 depositions per year, which means either we
  throw away 99.96% of our images, or we don't actually collect images
  anywhere near the ultimate capacity of the equipment we have.  In my
  estimation, both of these play about equal roles, with ~50-fold attrition
  between ultimate data collection capacity and actual collected data, and
  another ~50-fold attrition between collected data sets and published
  structures.
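Holton's capacity estimate is easy to reproduce in a couple of lines (a back-of-envelope sketch; the 2 minutes per dataset is my assumed figure, chosen to match the "few minutes" and the ~20,000,000 quoted above):

```python
# Rough capacity arithmetic using the figures quoted in the text above.
beamlines = 150
days_per_year = 200
minutes_per_day = 24 * 60
minutes_per_dataset = 2   # assumption: "a few minutes" per ~100-image dataset

capacity = beamlines * days_per_year * minutes_per_day // minutes_per_dataset
deposited = 8000          # approximate PDB depositions per year

print(capacity)                  # 21600000, i.e. ~20 million datasets/year
print(1 - deposited / capacity)  # ~0.9996, the "99.96%" quoted above
```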
 
  Personally, I think this means that the time it takes to collect the final
  dataset is not rate-limiting in a typical structural biology
  project/paper.  This does not mean that the dataset is of little value.
  Quite the opposite!  About 3000x more time and energy is expended preparing
  for the final dataset than is spent collecting it, and these efforts require
  experimental feedback.  The trick is figuring out how best to compress the
  data used to solve a structure for archival storage.  Do the previous
  data sets count?  Or should the compression be lossy about such
  historical details?  Does the stuff between the spots matter?  After all,
  h,k,l,F,sigF is really just a form of data compression.  In fact, there is
  no such thing as raw data.  Even raw diffraction images are a
  simplification of the signals that came out of the detector electronics.
  But we round off and average over a lot of things to remove noise,
  largely because noise is difficult to compress.  The question of how much
  compression is too much compression depends on which information (aka noise)
  you think could be important in the future.
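The remark that noise resists compression can be demonstrated directly (a throwaway sketch with simulated counts, not real detector data):

```python
import zlib

import numpy as np

rng = np.random.default_rng(0)
flat = np.full(100_000, 50, dtype=np.int32)        # noiseless flat background
noisy = rng.poisson(50, 100_000).astype(np.int32)  # Poisson-noisy background

# Compressed size as a fraction of the raw byte size.
ratio_flat = len(zlib.compress(flat.tobytes())) / flat.nbytes
ratio_noisy = len(zlib.compress(noisy.tobytes())) / noisy.nbytes

# The noiseless array shrinks to almost nothing; the Poisson counts carry
# real entropy and lossless compression cannot do much with them.
print(ratio_flat, ratio_noisy)
```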
 
  When it comes to fine-sliced data, such as that from Pilatus, the main
  reason why it doesn't compress very well is not because of the spots, but
  the background.  It 

Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Gerard Bricogne
Dear Colin,

 Thank you for accepting the heavy burden of responsibility your
colleagues have thrown onto your shoulders ;-) . It is great that you are
entering this discussion, and I am grateful for the support you are bringing
to the notion of starting something at ground level and learning from it,
rather than staying in the realm of conjecture and axiomatics, or entering
the virility contest as to whose beamline will make raw data archiving most
impossible.

 One small point, however, about your statement regarding multiple
lattices, that 

 ...  all crystals are, to a greater or lesser extent, subject to this.
 We just might not see it easily as the detector resolution or beam
 divergence is inadequate. Just think we could have several structures
 (one from each lattice) each with less disorder rather than just one
 average structure.

I am not sure that what you describe in your last sentence is a realistic
prospect, nor that it would in any case constitute the main advantage of
better dealing with multiple lattices. The most important consequence of
their multiplicity is that their spots overlap and corrupt each other's
intensities, so that the main benefit of improved processing would be to
mitigate that mutual corruption, first by correctly flagging overlaps, then
by partially trying to resolve those overlaps internally as much as scaling
procedures will allow (one could call that non-merohedral detwinning - it
is done e.g. by small-molecule software), and finally by adapting
refinement protocols to recognise that they may have to refine against
measurements that are a mixture of several intensities, to a degree and
according to a pattern that varies from one observation to another (unlike
regular twinning).

 Currently, if a main lattice can be identified and indexed, one tends
to integrate the spots it successfully indexes, and to abstain from worrying
about the accidental corruption of the resulting intensities by accidental
overlaps with spots of the other lattices (whose existence is promptly
forgotten). It is the undoing of that corruption that would bring the main
benefit, not the fact that one could see several variants of the structure
by fitting the data attached to the various lattices: that would be possible
only if overlaps were negligible. The prospects for improving electron
density maps by reprocessing raw images in the future are therefore
considerable for mainstream structures, not just as a way of perhaps teasing
interestingly different structures from each lattice in infrequent cases.
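The first step Gerard describes, correctly flagging overlaps, can be sketched in a few lines (all coordinates and the spot diameter below are invented; real software would predict thousands of spots per lattice from the indexing solutions):

```python
import numpy as np

# Predicted detector positions (mm) of spots from two indexed lattices.
lattice_a = np.array([[10.0, 12.0], [25.0, 40.0], [55.0, 60.0]])
lattice_b = np.array([[10.1, 12.2], [80.0, 15.0]])
spot_diameter = 0.5  # mm, assumed

# Pairwise centre-to-centre distances between the two prediction lists.
d = np.linalg.norm(lattice_a[:, None, :] - lattice_b[None, :, :], axis=2)

# A lattice-A spot closer than one spot diameter to any lattice-B spot is
# flagged: its measured intensity is a mixture of both lattices and must
# not be treated as a clean observation.
overlapped = (d < spot_diameter).any(axis=1)
print(overlapped)  # [ True False False]
```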

 I apologise if I have laboured this point, but I am concerned that
every slight slip of the pen that makes the benefits of future reprocessing
look as if they will just contribute to splitting hairs does a disservice to
this crucial discussion (and hence, potentially, to the community) by
belittling the importance and urgency of the task.


 With best wishes,
 
Gerard (B.)

--
On Wed, Oct 26, 2011 at 07:58:51PM +, Colin Nave wrote:
 I have been nominated by the IUCr synchrotron commission (thanks colleagues!) 
 to represent them for this issue. However, at the moment, this is a personal 
 view.
 
 1. For archiving raw diffraction image data for structures in the PDB, it 
 should be the responsibility of the worldwide PDB. They are by far the best 
 place to do it and as Jacob says the space requirements are trivial. Gerard 
 K's negative statement at CCP4-2010 sounds rather ex cathedra (in increasing 
 order of influence/power do we have the Pope, US president, the Bond Market 
 and finally Gerard K?). Did he make the statement in a formal presentation or 
 in the bar? More seriously, I am sure he had good reasons (e.g. PDB 
 priorities) if he did make this statement. It would be nice if Gerard could 
 provide some explanation.
 
 2. I agree with the "can do" attitude at Madrid as supported by Gerard B. 
 Setting up something as best one can with existing enthusiasts will get the 
 ball rolling, provide some immediate benefit and allow subsequent 
 improvements. 
 
 3. Ideally the data to be deposited should include all stages e.g. raw 
 images, corrected images, MIR/SAD/MAD images, unmerged integrated 
 intensities, scaled, merged etc. Plus the metadata and software versions used 
 for the various stages. Worrying too much about all of this should not of 
 course prevent a start being made. (An aside. I put "corrected" in quotes 
 because the raw images have fewer errors. The subsequent processing for 
 detector distortions etc. depends on an imperfect model for the detector. I 
 don't like the phrase "data correction").
 
 4. Doing this for PDB depositions would then provide a basis for other data 
 which did not result in PDB depositions. There seems to be a view that the 
 archiving of this should be the responsibility of the synchrotrons which 
 generated the data. This should be possible for some synchrotrons (e.g. 
 Diamond) where there 

Re: [ccp4bb] COOT not connected to PHENIX

2011-10-26 Thread Jaime Jensen
I experienced this same issue a while ago. When I attempt to reinstall coot 
using Fink (instructions on Bill Scott's coot page), I receive this error in 
Terminal:

WARNING: While resolving dependency nose-py27 for package 
numpy-py27-1.5.1-1, package nose-py27 was not found.
Reading build dependency for numpy-py27-1.5.1-1...
WARNING: While resolving dependency nose-py27 for package 
numpy-py27-1.5.1-1, package nose-py27 was not found.
Can't resolve dependency nose-py27 for package numpy-py27-1.5.1-1 (no
matching packages/versions found)
Exiting with failure.


I had originally downloaded ccp4-6.2.0 (Mac version), and experienced no 
problems until I tried to open coot from PHENIX. Any suggestions on how to 
resolve this issue?


Jaime


From: CCP4 bulletin board [CCP4BB@JISCMAIL.AC.UK] on behalf of Ed Pozharski 
[epozh...@umaryland.edu]
Sent: Wednesday, October 26, 2011 9:24 AM
To: CCP4BB@JISCMAIL.AC.UK
Subject: Re: [ccp4bb] COOT not connected to PHENIX

On Wed, 2011-10-26 at 10:33 +0200, Tim Gruene wrote:
 with every python script one has to distribute a specific python
 version

... and with every program one has to distribute binaries for every
platform... more food for my prejudice against software ;-)

This really is not about python, it's about distributing with or without
dependencies.  And you are absolutely right about that: for example,
ccp4-6.2.0 comes with python2.6.7 embedded, and, if one goes with
defaults and downloads coot with it, python2.6 in coot's lib folder.
Same with phenix - you get python2.7 with it and python2.4 with
pymol0.99 that comes with it.  By the way, I already have another pymol
that I compiled myself (1.4) and the one from ubuntu repositories (1.2).
Except for the latter, each carries its own copy of whichever python it
needs.  Every single python avatar takes 50-100Mb of space, which is
fortunately not in short supply.

This is why the right way to distribute *nix software is to distribute
software itself and ask the end-user to get all the dependencies (not
that hard these days).  It is fully understood, of course, that people
that do this for living find it more troublesome to deal with me whining
about how their software is screwing up my matplotlib than to just give
me another python copy. What's an extra 50Mb between friends ;-)

Cheers,

Ed.

--
I'd jump in myself, if I weren't so good at whistling.
   Julian, King of Lemurs



Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Colin Nave
Dear George, Martin

I don't understand the point that one is throwing away information by storing 
in frames. If the frames have sufficiently fine intervals (given by some 
sampling theorem consideration) I can't see how one loses information. Can one 
of you explain?
Thanks
Colin
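Colin's sampling argument can be made concrete with a toy rebinning example (simulated counts): coarse frames are recoverable from fine slices by summation, whereas the fine slices cannot be recovered from their coarse sums — which is presumably the information George has in mind.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ten fine-sliced "frames" of a 4-pixel detector (simulated counts).
fine = rng.poisson(5, size=(10, 4))

# Rebinning five consecutive fine slices gives the coarse frame a slower
# scan would have recorded: nothing is lost by slicing finely enough.
coarse = fine.reshape(2, 5, 4).sum(axis=1)
assert np.array_equal(coarse[0], fine[:5].sum(axis=0))

# The reverse is impossible: many different fine sequences sum to the same
# coarse frame, so coarse storage discards the within-frame profile.
print(coarse.shape)  # (2, 4)
```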




[ccp4bb] Off-topic: DSF thermo cycler low temp limit

2011-10-26 Thread Reginald McNulty
We are trying to determine the Tm value of rather unstable proteins with a Tm in
the mid 20 C range using DSF/thermofluor.  Our Stratagene thermocycler has a
low temp limit of 25 C (Peltier).  I called the company and they said it's a
'hardware limit' that cannot be changed.

1) Has anyone been able to 'hotwire' the MX3000/3005 to go below this limit?

2) Are there other thermofluor machines that allow lower starting
temperatures (say 4, 10 or 15 C)?

Best regards,
-Reggie
 



Re: [ccp4bb] IUCr committees, depositing images

2011-10-26 Thread Colin Nave
Dear Gerard

Yes, perhaps I was getting a bit carried away with the possibilities. Although 
I believe that, with high resolution detectors and low divergence beams, one 
should be able to separate out the various lattices, it is not really relevant 
to the main issue - getting the best from existing data.  The point I made 
about correcting data probably comes in a similar category - taking the 
opportunity to air a favourite subject.

Regards
  Colin

PS. While here though I realise one of my points was a bit unclear. Point 5 
should be
5.  My view is that for data in the PDB the same release rules should apply 
for the images as for the other data. For data not (yet) in the PDB, the 
funders of the research might want to define release rules. However, we can 
make suggestions!
The original had "For other data" rather than "For data not (yet) in the PDB".


[ccp4bb] unsubscribe ccp4bb

2011-10-26 Thread Hanna S . Yuan 袁小琀
 



Re: [ccp4bb] data processing problem with ice rings

2011-10-26 Thread ChenTiantian
Hi there,
Thank you for all your suggestions and generous help; I tried some of the
methods you mentioned and learned something new. I really appreciate it.
With Kay's help (after exclusion of the ice rings he found that the data are
P1, not P2(1)), I got my structure solved. There are four copies in the AU;
I cut the high-resolution limit to 2.4 Å, and the R/Rfree is now
0.2254/0.2648. This is not the final result; I'm still working on it.
Thank you so much.

Best Regards,

Tiantian


Re: [ccp4bb] cryo protection

2011-10-26 Thread Jens Kaiser
Hey Len,
  I had this problem, too. As you know, my favorite first try is always
fomblin (no need to mix anything). I had quite a bit of success in stubborn
cases by injecting about 4 uL of fomblin through the tape on top of the drop
and then looping crystals through the oil layer. You can wick the mother
liquor off and try them right away, or continue manipulation under oil.

Cheers,

Jens

 On Wed, 2011-10-26 at 11:46 -0500, Leonard Thomas wrote:
 Hi All,
 
 I have run into a very sensitive crystal system when it comes to cryo 
 protecting them.  I have run through the usual suspects, and trays are 
 going to be set up with a cryoprotectant as part of the crystallization 
 cocktail.  The one problem that seems to be occurring is that the 
 crystals crack as soon as they are transferred out of the original 
 drop.  I am running out of ideas and really would love some new ones.
 
 Thanks in advance.
 
 Len
 
 Leonard Thomas Ph.D.
 Macromolecular Crystallography Laboratory Manager
 University of Oklahoma
 Department of Chemistry and Biochemistry
 Stephenson Life Sciences Research Center
 101 Stephenson Parkway
 Norman, OK 73019-5251
 
 lmtho...@ou.edu
 http://barlywine.chem.ou.edu
 Office: (405)325-1126
 Lab: (405)325-7571


Re: [ccp4bb] Off-topic: DSF thermo cycler low temp limit

2011-10-26 Thread Jürgen Bosch
CD spec with Peltier is an option too

Jürgen 

..
Jürgen Bosch
Johns Hopkins Bloomberg School of Public Health
Department of Biochemistry  Molecular Biology
Johns Hopkins Malaria Research Institute
615 North Wolfe Street, W8708
Baltimore, MD 21205
Phone: +1-410-614-4742
Lab:  +1-410-614-4894
Fax:  +1-410-955-3655
http://web.mac.com/bosch_lab/



[ccp4bb] twist angle between monomers

2011-10-26 Thread Debajyoti Dutta
Hi all,

Does anybody know of any software to calculate the twist angle between two 
monomers in a dimeric assembly?

Or how to calculate it manually?

Thank you in advance.

Sincerely

Debajyoti

Re: [ccp4bb] twist angle between monomers

2011-10-26 Thread Ed Pozharski
Assuming you are dealing with a pure twist, isn't the polar rotation
angle reported by lsqkab or superpose what you are looking for?
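For a manual route, the same polar rotation angle can be pulled out of a Kabsch least-squares superposition in a few lines of numpy. This is a sketch with made-up coordinates, not lsqkab's actual code, but it computes the same quantity:

```python
import numpy as np

def rotation_angle_deg(P, Q):
    """Angle (degrees) of the least-squares (Kabsch) rotation taking the
    matched coordinate set P onto Q, e.g. CA atoms of the two monomers."""
    P = P - P.mean(axis=0)                    # remove the translation part
    Q = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # optimal rotation matrix
    cos_t = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_t)))

# Sanity check: rotate a made-up fragment by 30 degrees and recover it.
t = np.radians(30.0)
Rz = np.array([[np.cos(t), -np.sin(t), 0.0],
               [np.sin(t),  np.cos(t), 0.0],
               [0.0, 0.0, 1.0]])
P = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0], [1.0, 1.0, 1.0]])
print(round(rotation_angle_deg(P, P @ Rz.T), 1))  # 30.0
```

In practice P and Q would be the matched CA coordinates of the two chains after sequence alignment.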

On Thu, 2011-10-27 at 04:16 +, Debajyoti Dutta wrote:
 Hi all,
 
 Does anybody know of any software to calculate the twist angle between
 two monomers in a dimeric assembly.
 
 Or calculate manually.
 
 Thank you in advance.
 
 Sincerely
 
 Debajyoti
 
 


Re: [ccp4bb] cryo protection

2011-10-26 Thread James Holton
I have always been a fan of oil, which has already been suggested.  Have 
you tried that?


Cross-linking has already been suggested, and these are some good protocols:
Lusty (1999) J. Appl. Crystallogr. 32, 106-112.
McWhirter, et al. (1999) PNAS USA 96, 8408-8413.

In the latter paper the crystals cracked immediately upon breaking the 
seal on the cover slip (Linbro trays).  The cross-linker was introduced 
with a Hamilton syringe via a pre-cut and grease-filled hole in the 
crystallization chamber.  That way there were no mechanical vibrations 
at all.  You do NOT need to add it directly to the drop.  Glutaraldehyde 
has sufficient vapor pressure to permeate slowly into it.  After a few 
days, the crystals were incredibly robust, and gave the best 
diffraction.  The only problem with glutaraldehyde is if you have 
primary amines in your buffer, such as Tris.  If that is the case, you 
can usually substitute bis-Tris, or just use a different kind of 
crosslinker.


Another trick I like if you have a cryo component in the mother liquor 
(protein counts) is to just let the drop dry up slowly.  You can keep 
sampling it with a small loop (removes ~1 nl) until you see it 
flash-cool clear.  Then, if the crystals survived, you can flash-cool 
them in the dried-down mother liquor.  This worked for me once with 
crystals that just didn't want to transfer into anything.


-James Holton
MAD Scientist
