Dear Deepthi,
Is it just a typo, or do your last two sentences say that your data DO
NOT scale in P312 but scale well in P321?
Did you try pointless for space group determination? I have not used
molrep for this purpose and cannot judge how reliable
Hi David
I'm curious - do you mean running on a 32-bit Centos box or running
the 32-bit Mosflm executable on a 64-bit Centos box?
We did have one report of problems with the 32-bit exe on a 64-bit
box, which (seemingly) randomly gave one of two different results
(either the same failure
Hi,
On Wed, Apr 04, 2012 at 02:07:58PM -0700, Deepthi wrote:
Hello everyone
I have a problem scaling MAD data which was collected a week ago. The
data were collected to 1.5 Å resolution at three wavelengths for a Zn-MAD
experiment. Scaling the data for MAD experiments, the number of
Dear Bernard,
arp_waters is a very old code and it gets even older as we speak.
Try to use ARP/wARP version 7.2, where you can run the same task:
from the command line ($warpbin/auto_solvent.sh)
from the CCP4i GUI (ARP/wARP Solvent)
from ArpNavigator (Model Solvent)
There should be both 32 and
Dear 'aales...@burnham.org',
Re the pixel detector: yes, this is an acknowledged raw data archiving
challenge; possible technical solutions include: summing to make
coarser images, i.e. in angular range; lossless compression (nicely
described on this CCP4bb by James Holton); or preserving a
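To make the frame-summing idea concrete, here is a minimal sketch. The flat pixel lists stand in for real detector images (a real pipeline would read frames with dedicated I/O libraries, which this deliberately avoids):

```python
# Illustrative sketch: sum runs of consecutive fine-sliced frames into
# coarser "archive" frames, trading angular resolution for a smaller
# frame count. The tiny pixel lists below are invented stand-ins for
# real detector images.

def sum_frames(frames, group_size):
    """Sum each run of `group_size` frames pixel-by-pixel."""
    summed = []
    for start in range(0, len(frames), group_size):
        group = frames[start:start + group_size]
        summed.append([sum(pixels) for pixels in zip(*group)])
    return summed

# Example: six 4-pixel frames of 0.1 deg each -> three 0.2 deg frames
frames = [[1, 0, 2, 1], [0, 1, 1, 0], [2, 2, 0, 1],
          [1, 1, 1, 1], [0, 0, 3, 0], [1, 2, 0, 2]]
print(sum_frames(frames, 2))
```

Summing preserves total counts (and hence integrated intensities to first order), which is why it is a plausible compromise for archiving; the cost is that fine-sliced profiles can no longer be recovered.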
[Cross-posted from the 3DEM mailing list.]
--Gerard
-- Forwarded message --
Date: Wed, 4 Apr 2012 16:34:39 +0100
From: Helen Saibil h.sai...@mail.cryst.bbk.ac.uk
To: 3DEM Mailing List 3...@ncmir.ucsd.edu
Subject: [3dem] CCP-EM positions now available
Dear Colleagues,
We have
Dear Colleagues,
Clearly, no system will be able to perfectly preserve every pixel of
every dataset collected at a cost that can be afforded. Resources are
finite and we must set priorities. I would suggest that, in order
of declining priority, we try our best to retain:
1. raw data that
FYI, every NSF grant proposal now must have a data management plan that
describes how all experimental data will be archived and in what formats.
I'm not sure how seriously these plans are monitored, but a plan must be
provided nevertheless. Is anyone NOT archiving their original data in some
way?
I would say probably everybody keeps too many junk datasets around - at least I
do. And I run into the trouble of having to buy new TB drives every now and
then.
I think on average per year my group acquires currently ~700 GB of raw images
(compressed), now if we were to only keep the useful
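For scale, a back-of-the-envelope sketch of how such an archive grows (the 20% annual growth rate is an invented assumption for illustration, not the poster's figure):

```python
# Back-of-the-envelope sketch: cumulative raw-image storage for a group
# acquiring ~700 GB/year of compressed images, if acquisition grows by
# some fraction each year. The growth rate is an illustrative assumption.

def cumulative_storage_tb(per_year_gb, growth, years):
    """Total storage in decimal TB after `years` years of acquisition."""
    total_gb = 0.0
    for year in range(years):
        total_gb += per_year_gb * (1 + growth) ** year
    return total_gb / 1000.0

# e.g. 700 GB/year growing 20%/year, over a decade
print(round(cumulative_storage_tb(700, 0.20, 10), 1))
```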
Dear Herbert,
Category 4, we find in Manchester, is tricky, for want of a better word.
Needless to say, we have collaborators on our Crystallography Research
Service who request data sets from, e.g., ten years ago that are now urgently
needed for writing up publications. So we are keeping everything,
Dear Roger,
At the recent ICSTI Workshop on Delivering Data in Science, when I asked
about monitoring, the NSF presenter replied that the PIs' annual reports
should include data management aspects.
See http://www.icsti.org/spip.php?rubrique42
Best wishes,
John
Prof John R Helliwell DSc FInstP
It seems that deposition of map coefficients is a good idea. Does someone have
an mtz2cif that can handle this?
Thanks!
F
-
Francis E. Reyes M.Sc.
215 UCB
University of Colorado at Boulder
Have you tried mtz2various (with cif output)?
Pete
Francis E Reyes wrote:
It seems that deposition of map coefficients is a good idea. Does someone have an mtz2cif that can handle this?
Thanks!
F
-
Francis E. Reyes M.Sc.
215 UCB
University of
On Thursday, April 05, 2012 08:25:05 am Francis E Reyes wrote:
It seems that deposition of map coefficients is a good idea.
Does someone have an mtz2cif that can handle this?
Maybe I missed something.
What is accomplished by depositing map coefficients that isn't
done better by depositing Fo
I have not tried it, but the latest version of the RCSB
program sf-convert is supposed to support it
(see version 1.2, released March 23):
http://sw-tools.pdb.org/apps/SF-CONVERT/index.html
http://sw-tools.pdb.org/apps/SF-CONVERT/doc/V1-2-00/documentation.html
(Version 1.2 is not yet available as
Fc doesn't contain the weighting scheme used in the creation of the map
coefficients, so Fc would require some sort of program to be run to
recreate those for both 2Fo-Fc and Fo-Fc maps. By which time you might
as well run a single cycle of the refinement program in question to
generate new
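To illustrate the point, a minimal sketch of sigma-A-style weighted coefficients. The m (figure of merit) and D (sigma-A scale) values below are invented for illustration; in practice they are per-reflection quantities produced by the refinement program, which is exactly why plain deposited Fo/Fc cannot reproduce the maps:

```python
# Sketch of sigma-A-weighted map coefficients, 2mFo - DFc and mFo - DFc.
# m and D come from the refinement program; Fc alone does not carry them,
# so the weighted coefficients cannot be rebuilt from Fo/Fc by a viewer.
# All numbers below are invented for illustration.

def map_coeffs(fo, fc, m, d):
    """Return (2mFo - DFc, mFo - DFc) amplitudes for one reflection."""
    two_fofc = 2.0 * m * fo - d * fc
    fofc = m * fo - d * fc
    return two_fofc, fofc

fo, fc, m, d = 812.0, 790.0, 0.92, 0.88
print(map_coeffs(fo, fc, m, d))
```

With m = D = 1 the coefficients reduce to the unweighted 2Fo-Fc and Fo-Fc forms, which shows precisely what information is lost when only Fo and Fc are deposited.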
Hello
I arrived at the P312 space group by running a self-rotation function using
MOLREP. The maps show the space group as P312. I was scaling the data
individually for each wavelength. None of the three wavelengths scale
in the P312 space group.
On Thu, Apr 5, 2012 at 2:17 AM,
Hi -
Let me just add that P312 is a very uncommon space group for protein
crystals, much less common than P321. (This doesn't mean you don't have
it - it's just unlikely.) If you look at PDB statistics:
P 3 1 2 : 12 structures
P3(1) 1 2: 61 structures
P3(2) 1 2: 85 structures
P 3 2 1 :
On Thursday, April 05, 2012 09:30:25 am Phil Jeffrey wrote:
Fc doesn't contain the weighting scheme used in the creation of the map
coefficients, so Fc would require some sort of program to be run to
recreate those for both 2Fo-Fc and Fo-Fc maps.
The viewers I am familiar with do this for
On Thu, 5 Apr 2012, Ethan Merritt wrote:
On Thursday, April 05, 2012 09:30:25 am Phil Jeffrey wrote:
Fc doesn't contain the weighting scheme used in the creation of the map
coefficients, so Fc would require some sort of program to be run to
recreate those for both 2Fo-Fc and Fo-Fc maps.
The
Hi,
I would like to advertise a position on behalf of Prof. Lois Weisman
(Michigan).
Interested parties should contact her directly
(lweisman.off...@gmail.com).
-Amir
Postdoctoral Fellow Position
Integration of high resolution structures with biology
Seeking a
On Thursday, April 05, 2012 10:48:16 am Oliver Smart wrote:
On Thu, 5 Apr 2012, Ethan Merritt wrote:
On Thursday, April 05, 2012 09:30:25 am Phil Jeffrey wrote:
Fc doesn't contain the weighting scheme used in the creation of the map
coefficients, so Fc would require some sort of program
Dear John,
Thank you for a very informative letter about the IUCr activities towards
archiving experimental data. I feel that I did not explain myself properly.
I do not object to archiving the raw data; I just believe that the current
methodology for validating data at the PDB is insufficiently robust
Can MOSFLM work with image files of type .x (BNL X6A)? I am having no luck...
I know it can do .cbf (BNL X25), for instance.
Thanks a lot
There is an immediate opening for a protein crystallographer position at
Harvard Medical School's Children's Hospital. The research is focused on
the structural and functional investigation of the Wnt signaling pathway. The
project is a close collaborative effort between Professors Xi He and
Postdoctoral positions are available in the Cell Biology and
Biophysics Unit headed by Dr. Antonina Roll-Mecak at the National
Institute of Neurological Disorders and Stroke. The Roll-Mecak
Laboratory is interested in understanding the interplay between
microtubules and their regulators
Thanks for kindly pointing that out (despite the level of stupidity on my part).
Those were not the raw images... they were DENZO output files. It's been a while.
This discussion has been interesting, and it's provided an interesting forum
for those interested in dealing with fraud in science. I've not contributed
anything to this thread, but the message from Alexander Aleshin prodded me to
say some things that I haven't heard expressed before.
1. The
I also don't really worry about the images as a primary means of fraud
prevention, although that may be a useful side effect. These cases are
spectacular but so rare that they alone would not justify the effort.
That it can be a useful political instrument to make that argument and get
Well, it looks like my opinion about the importance of validating data at the
moment of submission does not attract much support; that is sad but understandable.
Automatically redoing PDB structures by professionals is a good idea; I myself
suggested a similar thing 10 years ago at Accelrys (we
Ojweh
c) Discarding your primary data is generally considered bad form...
Agreed, but it is a big burden on labs to maintain archives of their raw
data indefinitely.
Even the IRS allows you to discard them after some time.
But you DO have to file in the first place, right? How long to keep is an
Alright, if the image deposition is the only way out, then I am for it, but
please make sure that synchrotrons will do it for me...
On Apr 5, 2012, at 7:58 PM, Bernhard Rupp (Hofkristallrat a.D.) wrote:
Ojweh
c) Discarding your primary data is generally considered bad form...
Agreed,
How should they?
They have no clue which of the 20 datasets was actually useful for solving your
structure.
If you ask James Holton, he has suggested going back to the archived data
after a certain time and trying to solve the undeposited structures then :-)
[Where is James anyhow? Haven't seen a
Did you, as a child, play a game called "broken telephone"? It is when someone
quickly tells something to a neighbor, and so on, until the words come back to
the author. A very funny game.
My original thesis was that downloading/depositing the raw images would be a
pain in the neck for