Just to add to Raji and Dave's comments.  It seems to me that over the last
5-10 years learning theory has indeed been discouraged, on the grounds that
it takes up too much time etc.  This can be seen in changes to the advice
given when writing up a thesis.  People used to write a theory chapter in
which they outlined the theory behind the methods they had used.  This showed
that a student had at least some grasp of what was going on.  In my case at
least it has proved a useful memory aid, as it's written in my own hand
and is handy when brushing up on theory.  When I was writing up, however,
the usual advice was not to bother, as you would only get asked about it in
the viva.  This led many people to avoid theory.
As the number of techniques a crystallographer uses increases, the ability to
go into depth on the theory behind each skill diminishes.  Many of us do some
EM, NMR, fluorescence and kinetics, as well as protein purification and
molecular biology, and I'm sure many other things besides.  The temptation to
become a 'jack of all trades, master of none' is clearly there, but it has a
deleterious effect on quality.  When people do PhDs they usually concentrate
on one or two techniques.  In my opinion, people should be required to go
into depth on at least those techniques during their PhD.  That would give
them a solid grounding in their initial specialty.
I'm not saying that everyone should be a grand master of the dark arts; I'm
certainly not one.  But a good grounding means you can deal with problems in
a valid way.

Amb


On 30/8/07 12:38, "David Briggs" <[EMAIL PROTECTED]> wrote:

> I'm going to agree with Raji's observations, and fan the flames of his point a
> little.
> 
> I count myself as lucky that I have had access to certain people during my
> crystallographic training who had a good understanding of the theory behind
> crystallography (hopefully I have exploited this luck sufficiently). Despite
> their tutelage, I will hold my hands up and admit that certain technical
> discussions on the bb leave me a little confused occasionally...
> 
> However, I have seen what Raji described going on around me, and it is pretty
> prevalent. Structures are sometimes pushed through without the PhD student
> really knowing quite what's happened. I cut my teeth on a few structures that
> didn't have the sort of pressure on them that others had, and this allowed me
> to get to grips with what was going on. I also had a few real pigs of
> projects, and you tend to learn a lot more when stuff goes wrong. If you bang
> your data through program X and get textbook maps & stats, you haven't really
> learnt anything; if you've struggled with molecular replacement, your SeMet
> won't crystallise and your heavy atoms won't stick, then you tend to learn
> how to make Phaser run the last half yard, etc. That half yard often comes
> about from thinking about your problems in the right way, and having an
> "old-school" crystallographer to bang ideas off can be invaluable at this
> point. Learning the theory is not always encouraged, and, given that to do it
> properly takes some time and application, it is often towards the bottom of
> the priority list. It would take a ballsy student to say to their boss, "No, I
> can't do experiments X, Y & Z until I have read & understood this paper on
> Maximum Likelihood!"
> 
> However, in a system in which there is a fair amount of pressure and
> competition (exacerbated in the US system, I think), the temptation to hand
> off ALL data to a "structure-solver" can be great. If this practice
> continues, as Raji suggests, there will be a lack of properly trained
> crystallographers, and mistakes will be more likely to occur.
> 
> The suggestion of explicitly stating "X crystallised, Y collected data, Z
> phased and refined" in a paper is a good one, and some journals (e.g. Nature)
> like an "author contributions" section. However, if a group is willing to
> 'overlook' problems in their data, as recently seen, maybe they cannot be
> trusted to make these statements accurately.
> 
> I think that the only watertight way of preventing such mistakes is to
> have every paper that contains a structure reviewed by at least one
> properly trained crystallographer, and to have the data (PDB files &
> structure factors) made available to them.
> 
> just my lunchtime ramble...
> 
> Dave
> 
> On 29/08/2007, Raji Edayathumangalam <[EMAIL PROTECTED]> wrote:
>> I would like to mention some other issues now that Ajees et al. has stirred
>> all sorts of 
>> discussions. I hope I haven't opened Pandora's box.
>> 
>> From what I have learned around here, very often there seems to be little
>> time allowed or allocated to actually learn, even a bit beyond the surface,
>> some of the crystallography or what the crystallographic software is doing
>> during the structure solution process.
>> 
>> A good many of the postdocs and students here are under incredible pressure
>> to get the structure DONE asap. For some of them, it is their first time
>> solving a crystal structure. Yes, the same heap of reasons: because it's
>> "hot" or "competitive", a grant deadline, PI tenure pressure, etc.
>> Learning takes a back seat, and this is total rubbish and very scary, in my
>> biased personal opinion.
>> Although I think it is the person's responsibility to take the time and
>> initiative to learn, I also see that the pressure is often insurmountable.
>> Often, the PI and/or the lab's designated "structure solver" pretty much
>> takes charge at some early stage of structure determination and solves the
>> structure with much less contribution from the scientist in training
>> (student/postdoc). All that slog to clone, purify, crystallize and optimize
>> diffraction, only to find that someone else will come along, process the
>> data and "finish up" the structure for you. Such 'training' (or lack
>> thereof) is a recipe for generating 'bad' structures in the future, and is
>> part of the reason for this endless thread.
>> 
>> I think it is NOT as common for someone else to, say, run all the Western
>> blots for you, maintain your cell lines for you, or do your protein preps
>> for you. Does this happen because it is much easier to load someone else's
>> crystallographic data onto one's own machine and solve the structure (since
>> this does not demand the same kind of physical labor and effort, and is
>> also a lot of fun)? I understand when the PI or "structure solver" does the
>> above as part of a team effort and allows the person in question to learn.
>> But often I see that the person is left somewhat overwhelmed and clueless
>> in the end.
>> 
>> I bring this issue to the forum since I do not know whether this phenomenon
>> is ubiquitous. If this practice is a rampant weed, can we as a
>> crystallographic community put some measures in place to stanch it?
>> 
>> How about ALL journals explicitly listing who did what during the
>> crystallographic analysis? Is
>> there a practical solution?
>> 
>> I suspect that what I describe is not merely anecdotal. Any solutions?
>> Raji
>> 
>> 
>> 
>> 
>> ------
>> Date:    Thu, 23 Aug 2007 16:17:23 -0700
>> From:    Dale Tronrud <[log in to unmask]>
>> Subject: Re: The importance of USING our validation tools
>> 
>> In the cases you list, it is clearly recognized that the fault lies with the
>> investigator and not
>> the method. In most of the cases where serious problems have been identified
>> in published models, the
>> authors have stonewalled by saying that the method failed them.
>> 
>> "The methods of crystallography are so weak that we could not detect (for
>> years) that our program
>> was swapping F+ and F-."
>> 
>> "The scattering of X-rays by bulk solvent is a contentious topic."
>> 
>> "We should have pointed out that the B factors of the peptide are higher then
>> those of the protein."
>> 
>> It appears that the problems occurred because these authors were not
>> following established
>> procedures in this field. They are, as near as I can tell, somehow immune
>> from the consequences of
>> their errors. Usually the paper isn't even retracted when the model is
>> clearly wrong. They can dump
>> blame on the technique and escape personal responsibility. This is what
>> upsets so many of us.
>> 
>> It would be so refreshing to read in one of these responses "We were under a
>> great deal of pressure
>> to get our results out before our competitors and cut corners that we
>> shouldn't have, and that
>> choice resulted in our failure to detect the obvious errors in our model."
>> 
>> If we did see papers retracted, if we did see nonrenewal of grants, if we did
>> see people get fired,
>> if we did see prison time (when the line between carelessness and fraud is
>> crossed), then we could
>> be comforted that there is practical incentive to perform quality work.
>> 
>> Dale Tronrud