Whether to archive all raw data and subsequently make it public is
something the large facilities are currently debating. Here at the
ESRF we store user data for only 6 months (although I believe it
remains available for longer on tape), and we already have trouble
with capacity. My personal view is that facilities should take the lead
on this - for MX we already have a very good archiving system, ISPyB,
which also runs at Diamond. ISPyB stores extensive metadata and JPEGs
of the raw images; rather than the images themselves, it stores a link
to the location of the data, with an option to download if the data are
still available. My preferred
option would be to store all academically funded data and then make it
publicly available after, say, 2-5 years (this will no doubt spark
another debate on time limits, special dispensations, etc.). What needs
to be thought about is how to organise the data and how to ensure that
the correct metadata are stored with each data set - this will rely
heavily on user input at the time of the experiment rather than on
gathering data sets together for deposition much later. As already mentioned,
this type of resource could be extremely useful for developers and also
as a general scientific resource. Smells like an EU grant to me.
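To make the metadata point concrete, a per-data-set record in an
ISPyB-style catalogue might look something like the sketch below. This
is purely illustrative - it is not the actual ISPyB schema, and every
field name, path, and the embargo logic are assumptions for the sake of
the example.

```python
# Hypothetical sketch of a per-data-set metadata record for an
# ISPyB-style catalogue; field names are illustrative, NOT the real schema.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class DataSetRecord:
    proposal: str                 # academically funded proposal number
    beamline: str                 # e.g. "ID23-1"
    collected: date               # date of the experiment
    detector: str
    wavelength_A: float
    raw_data_path: str            # link to the data, not the images themselves
    thumbnails: List[str] = field(default_factory=list)  # JPEGs of raw images
    embargo_years: int = 3        # somewhere in the suggested 2-5 year window

    def public_release(self) -> date:
        """Date after which the raw data would become publicly available."""
        return self.collected.replace(
            year=self.collected.year + self.embargo_years)

    def is_public(self, today: date) -> bool:
        return today >= self.public_release()

# Illustrative record; the path and detector are made up.
record = DataSetRecord(
    proposal="mx1234",
    beamline="ID23-1",
    collected=date(2011, 10, 26),
    detector="Pilatus 6M",
    wavelength_A=0.976,
    raw_data_path="/data/visitor/mx1234/id23eh1/collection_001/",
)
print(record.public_release())   # 2014-10-26 with the default 3-year embargo
```

The point of capturing this structure at collection time, rather than
reconstructing it years later, is exactly the user-input issue raised
above: only the experimenter can fill in these fields reliably while
the experiment is fresh.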
Cheers, Matt.
On 26/10/2011 10:21, Frank von Delft wrote:
Since when has the cost of any project been limited by the cost of
hardware? Someone has to /implement/ this -- and make a career out of
it; thunderingly absent from this thread has been the chorus of
volunteers who will write the grant.
phx
--
Matthew Bowler
Structural Biology Group
European Synchrotron Radiation Facility
B.P. 220, 6 rue Jules Horowitz
F-38043 GRENOBLE CEDEX
FRANCE
===================================================
Tel: +33 (0) 4.76.88.29.28
Fax: +33 (0) 4.76.88.29.04
http://go.esrf.eu/MX
http://go.esrf.eu/Bowler
===================================================