Dear John and colleagues,

     There seems to be a set of centrifugal forces at play within this thread
that are distracting us from a sensible path of concrete action by throwing
decoys in every conceivable direction, e.g.

     * "Pilatus detectors spew out such a volume of data that we can't
possibly archive it all" - does that mean that because the 5th generation of
Dectris detectors will be able to write one billion images a second and
catch every scattered photon individually, we should not try to archive
more information than is given by the current merged structure factor data?
That seems a complete failure of reasoning to me: there must be a sensible
form of raw data archiving that would stand between those two extremes and
would retain much more information than the current merged data but would
step back from the enormous degree of oversampling of the raw diffraction
pattern that the Pilatus and its successors are capable of.

     * "It is all going to cost an awful lot of money, therefore we need a
team of grant writers to raise its hand and volunteer to apply for resources
from one or more funding agencies" - there again there is an avoidance of
the feasible by invocation of the impossible. The IUCr Forum already has an
outline of a feasibility study that would cost only a small amount of
joined-up thinking and book-keeping around already stored information, so
let us not use the inaccessibility of federal or EC funding as a scarecrow
to justify not even trying what is proposed there. And the idea that someone
needs to decide to stake his/her career on this undertaking seems totally
overblown.

     Several people have already pointed out that the sets of images that
would need to be archived would be a very small subset of the bulk of
datasets that are being held on the storage systems of synchrotron sources.
What needs to be done, as already described, is to be able to refer to those
few datasets that gave rise to the integrated data against which deposited
structures were refined (or, in some cases, solved by experimental phasing),
to give them special status in terms of making them visible and accessible
on-line at the same time as the PDB entry itself (rather than after the
statutory 2-5 years that would apply to all the rest, probably in a more
off-line form), and to maintain that accessibility "for ever", with a link
from the PDB entry and perhaps from the associated publication. It seems
unlikely that this would involve the mobilisation of such large resources as
to require either a human sacrifice (of the poor person whose life would be
staked on this gamble) or writing a grant application, with the indefinite
postponement of action and the loss of motivation this would imply.
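     As a rough back-of-the-envelope check, the figures quoted further down
this thread (about 8000 PDB depositions per year, very roughly 1.7 GB of raw
images per deposited structure, and the per-terabyte storage costs cited by
Herbert Bernstein) can be combined in a few lines of Python; every input is,
of course, only an order-of-magnitude assumption:

    # Back-of-the-envelope cost of archiving deposition-linked raw data.
    # All inputs are rough assumptions taken from figures in this thread.
    depositions_per_year = 8000      # approximate PDB throughput
    gb_per_dataset = 1.7             # assumed raw-image volume per structure

    tb_per_year = depositions_per_year * gb_per_dataset / 1000.0  # ~13.6 TB

    # Long-term storage costs per TB per year (NSF RDLM estimates):
    rates = {"institutional": (1000, 3000),   # moderate-sized institutions
             "commercial":    (100, 300)}     # large storage farms

    for label, (low, high) in rates.items():
        print(f"{label}: ${low * tb_per_year:,.0f} to "
              f"${high * tb_per_year:,.0f} per year")

     Even at the conservative institutional rates this comes to roughly
$14,000 to $41,000 a year, hardly a sum whose raising requires anyone to
stake a career on it.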

     Coming back to the more technical issue of bloated datasets: deciding
on a sensible form of compression for overly verbose sets of thin-sliced,
perhaps low-exposure images is a scientific problem that must be amenable
to rational analysis. Such a compressed form would already retain a large
fraction, if not all, of the extra information on which we would wish
future improved versions of processing programs to cut their teeth for a
long time to come. This approach would seem preferable to stoking up
irrational fears of not being able to cope with the most exaggerated
predictions of the volumes of data to archive, and thus doing nothing at
all.
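     To give one concrete illustration of what such rational analysis might
look like: thin-sliced, low-exposure frames contain mostly small counts with
small pixel-to-pixel differences, so even a simple delta encoding in the
spirit of the CBF byte-offset scheme shrinks them considerably, and summing
adjacent slices trades away some of the angular oversampling for a further
reduction. The Python sketch below uses simulated Poisson frames and a
deliberately simplified version of the encoding, purely as a toy model:

    import numpy as np

    def byte_offset_size(frame):
        """Approximate encoded size in bytes under a simplified CBF-style
        byte-offset scheme: 1 byte per pixel whose delta to the previous
        pixel fits in a signed byte, 3 bytes (escape + int16) otherwise."""
        deltas = np.diff(frame.astype(np.int32).ravel(), prepend=0)
        small = np.abs(deltas) < 128
        return int(small.sum() + 3 * (~small).sum())

    rng = np.random.default_rng(0)
    # 10 simulated thin-sliced, low-exposure frames, mean ~2 counts/pixel
    frames = rng.poisson(lam=2.0, size=(10, 512, 512)).astype(np.uint16)

    raw = frames.nbytes
    packed = sum(byte_offset_size(f) for f in frames)
    print(f"raw {raw/1e6:.1f} MB, delta-encoded {packed/1e6:.1f} MB "
          f"({raw/packed:.1f}x smaller)")

    # Trade oversampling for volume: sum pairs of adjacent thin slices.
    summed = frames.reshape(5, 2, 512, 512).sum(axis=1).astype(np.uint16)
    packed2 = sum(byte_offset_size(f) for f in summed)
    print(f"2x slice summing: {packed2/1e6:.1f} MB ({raw/packed2:.1f}x)")

     Real diffraction images will not behave exactly like this Poisson toy,
but the underlying point stands: the information content of low-count frames
is far below their nominal size on disk, and the right operating point
between "keep every bit" and "keep only merged intensities" can be found by
experiment rather than by fear.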

     I very much hope that the "can do" spirit that marked the final
discussions of the DDDWG (Diffraction Data Deposition Working Group) in
Madrid will prevail over all the counter-arguments that consist of moving
the goalposts in order to prove that the initial goal is unreachable.


     With best wishes,
     
          Gerard.

--
On Wed, Oct 26, 2011 at 02:18:25PM +0100, John R Helliwell wrote:
> Dear Frank,
> re 'who will write the grant?'.
> 
> This is not as easy as it sounds, would that it were!
> 
> There are two possible business plans:-
> Option 1. Specifically for MX, the PDB is the first and foremost
> candidate to seek such additional funds for full diffraction data
> deposition for each future PDB entry. This business plan
> possibility is best answered by PDB/EBI (e.g. Gerard Kleywegt has
> answered this in the negative thus far, at the CCP4 meeting in January 2010).
> 
> Option 2. The journals that host the publications could add the cost to
> the subscriber and/or the author according to their funding model. As
> an example and as a start a draft business plan has been written by
> one of us [JRH] for IUCr Acta Cryst E; this seemed attractive because
> of its simpler 'author pays' financing. This proposed business plan is
> now with IUCr Journals to digest and hopefully refine. Initial
> indications are that Acta Cryst C would be perceived by IUCr Journals
> as a better place to start considering this in detail, as it involves
> fewer crystal structures than Acta E and would thus be more
> manageable. The overall advantage of the responsibility being with
> Journals as we see it is that it encourages such 'archiving of data
> with literature' across all crystallography related techniques (single
> crystal, SAXS, SANS, Electron crystallography etc) and fields
> (Biology, Chemistry, Materials, Condensed Matter Physics, etc.) i.e. not
> just one technique and field, although obviously biology is dear to
> our hearts here in the CCP4bb.
> 
> Yours sincerely,
> John and Tom
> John Helliwell  and Tom Terwilliger
> 
> On Wed, Oct 26, 2011 at 9:21 AM, Frank von Delft
> <frank.vonde...@sgc.ox.ac.uk> wrote:
> > Since when has the cost of any project been limited by the cost of
> > hardware?  Someone has to implement this -- and make a career out of it;
> > thunderingly absent from this thread has been the chorus of volunteers who
> > will write the grant.
> > phx
> >
> >
> > On 25/10/2011 21:10, Herbert J. Bernstein wrote:
> >
> > To be fair to those concerned about cost, a more conservative estimate
> > from the NSF RDLM workshop last summer in Princeton is $1,000 to $3,000
> > per terabyte per year for long-term storage, allowing for overhead in
> > moderate-sized institutions such as the PDB.  Larger entities, such
> > as Google, are able to do it for much lower annual costs in the range of
> > $100 to $300 per terabyte per year.  Indeed, if this becomes a serious
> > effort, one might wish to consider involving the large storage farm
> > businesses such as Google and Amazon.  They might be willing to help
> > support science partially in exchange for eyeballs going to their sites.
> >
> > Regards,
> >    H. J. Bernstein
> >
> > At 1:56 PM -0600 10/25/11, James Stroud wrote:
> >
> > On Oct 24, 2011, at 3:56 PM, James Holton wrote:
> >
> > The PDB only gets about 8000 depositions per year
> >
> > Just to put this into dollars. If each dataset is about 1.7 GB in
> > size, then that's about 14 TB of storage that needs to come online
> > every year to store the raw data for every structure. A two-second
> > search reveals that Newegg has a 3 TB Hitachi for $200. So that's
> > about $1000 / year of storage for the raw data behind PDB deposits.
> >
> > James
> >
> >
> 
> 
> 
> -- 
> Professor John R Helliwell DSc

-- 

     ===============================================================
     *                                                             *
     * Gerard Bricogne                     g...@globalphasing.com  *
     *                                                             *
     * Global Phasing Ltd.                                         *
     * Sheraton House, Castle Park         Tel: +44-(0)1223-353033 *
     * Cambridge CB3 0AX, UK               Fax: +44-(0)1223-366889 *
     *                                                             *
     ===============================================================
