Re: [ccp4bb] Open position - data management in biophysics

2021-01-20 Thread Markus Heckmann
Dear PIs and senior scientists involved in recruitment,

Why do so many (especially postdoc) positions these days indicate:

> Readiness for high workload

> able to work independently but also effectively and collaboratively with 
> other lab member

> Candidates should have a documented publication record in peer-reviewed 
> journals, able to work both independently and as an effective team member.

Are candidates meant to quietly understand that they will need to work on
weekends or holidays? And what does it mean to work independently and
collaboratively at the same time? Or is this just a template from HR
departments?

Has it always been like this in the science world, or do we too need to work
like Amazon warehouse workers (you can google it and see the pain)?

Saddened...

Mark
(I am not trying to single out any individual PI/person, but overall it is
the same words repeated...)



> We are opening a new position for an upcoming European project.
>
> *We are looking for an expert in scientific programming with experience in
> scientific data processing for a European project focused on Standards for
> Data Archival and Exploitation. *
>
> Job description:
>
> We offer attractive work connected to development of data management
> infrastructure for biophysical data in the frame of an international
> project at the Institute of Biotechnology in the centre of excellence
> Biocev. The main responsibility lies in definition of data standards and
> models for biophysical data, development of algorithms, design of user
> interface, and realization of a pilot database of biophysical data. The
> person is expected to actively participate in multilateral international
> negotiations, to drive the tasks fulfillment in collaboration with the
> local international partners, and to present the results.
>


> Dear all,
>
> Two postdoctoral positions are available in the laboratory of Dr. Pengxiang
> Huang, Assistant Professor and CPRIT scholar in cancer research in the
> Department of Molecular and Cellular Biology at Baylor College of Medicine.
> With the long-standing interest in sterol lipids, the Huang lab investigates
> the poorly understood mechanisms involved in Hedgehog (Hh) and Wnt signal
> transduction, two related pathways that play critical roles in development,
> regeneration and cancer. We utilize a combination of biochemistry, cell,
> chemical and structural biology approaches, including both X-ray
> crystallography and Cryo-EM. Our recent work identified cholesterol as the
> endogenous ligand for Smoothened, the key signal transducer and oncoprotein
> in the Hh pathway (Cell. 166:1176-87). We also characterized the structural
> and oncogenic basis of Smoothened activation, demonstrating for the first
> time the active conformation of a class F GPCR (Cell. 174:312-24). The
> highly interdisciplinary and collaborative environment of our group will
> thus provide unique career development opportunities for future postdoctoral
> trainees.
>
> We are seeking highly-motivated candidates with a Ph.D. in biomedical
> sciences and significant experience in molecular biology, protein
> biochemistry and/or cell biology. Prior knowledge in structural biology
> (X-ray crystallography or Cryo-EM) is highly desirable, but not required.
> Candidates should have a documented publication record in peer-reviewed
> journals, able to work both independently and as an effective team member.
>
> To apply, please send your CV, a short summary of research experience, and
> three reference letters to cnssolve[at]gmail.com.
>
>

> collaborative research environment focused on structural/molecular mechanism
> of broadly neutralizing antibodies specific to MPER segment of the gp41
> subunit in membrane environments, its implication to immunogen design and
> immunogenicity. The laboratory provides a rich training environment and
> access to cutting-edge techniques.
>
> Highly-motivated candidates with a recent PhD or MD/PhD in biomedical
> sciences and significant experience in molecular biology and biochemistry
> are encouraged to apply. Skills in molecular biology, biochemistry and
> extensive biomolecular NMR experience are essential.  Candidates should have
> a documented publication record in peer-reviewed journals. We are seeking a
> candidate with excellent writing and communication skills, able to work
> independently but also effectively and collaboratively with other lab
> members. Applicants should include a short statement of research goals, CV
> and three references with their application.
>





Re: [ccp4bb] not solely pdb issue: need someone to officially settle the pdb dispute

2019-08-22 Thread Markus Heckmann
Dear Flemming,

On 8/21/19, Flemming Goery  wrote:
> I find the message in my original e-mail has changed, perhaps by hackers,

What do you mean by 'hackers', by the way?!


>
> Dear all:
> A has sought a job in the lab of B. B invited A for an interview with a PPT
> oral presentation; as requested, A sent the PPT on the structural biology
> research of XXX to B by e-mail, and presented it in front of B and his
> postdoctoral researcher.
>
> After the interview, B requested that all research documents on XXX
> (including detailed reports, all done by A) be sent by A to B by e-mail.
> A sent them, including 2 sets of PDB coordinates for the same structure, one
> set with solvent, one without. A told B that all intellectual property in the
> documents and the research belonged to A, based on the regulations of A's
> institute.
>

Who was the boss/PI of A?

If A did transfer all intellectual property to B, then it is already
'game over' for A.


> B sought a reference from A's institute, from someone A did not agree to. It
> seems the referee told B that one set of PDB coordinates had been deposited
> (the one without solvent, also completed by A).
>
> Then B did not give the offer to A. A joined Institute D, without
> independent funding for the writing (in fact, no salary to support this
> writing, and no fee for publication of this work).

While one could sympathize with A, this has no real effect on the claim.

>
> Several years later, A found B's paper, i.e., the paper concerned, published
> in Journal C. In the paper, B used the information from the deposited PDB
> nine times (already a significant part of the paper, not to mention the
> information from the other documents sent to B by A). In the paper, it says
> something like 'based on our work on the structure of (followed by the
> 4-letter PDB code)', which implies the structure was solved by the authors of
> the paper rather than by A.
>
> A contacted Journal C, Journal C contacted B, and B claimed the deposited PDB
> was public-domain knowledge. Journal C took the action of adding a reference
> to the deposited PDB to the paper.

--> Wait - who deposited the model?
--> Did B deposit the model without including A?
--> Can you mention the PDB code?  :-)

> As mentioned, the paper mentions and uses information from the deposited PDB
> nine times, and the reference mark was not added to the first occurrence of
> the deposited PDB, but added (only once in total for the nine occurrences of
> the deposition code) to a paragraph from which it can be concluded that the
> authors actually used the undeposited PDB with the solvent. In other words,
> although a reference to the deposited PDB was added by a correction, from
> where the reference mark was placed it cannot be shown that they referred to
> the cited PDB (completed by A), not to mention the undeposited PDB with
> solvent which, based on that paragraph, they appear to have used.
>
> A's concern is that A cannot exclude the possibility that the research in
> the paper other than the part related to the PDB, i.e., the part done in B's
> lab, was fabricated by the current paper authors; thus A requests paper
> retraction as the main claim.

Not sure about that. Maybe you could contact the people at
https://retractionwatch.com/
https://twitter.com/retractionwatch
They may have more experience with these issues.



> If it cannot be retracted, A requests to be the corresponding author
> (sometimes requesting co-first author, sometimes both co-first and
> co-corresponding author), since without A's work (the PPT presentation, the
> 2 sets of PDBs, all the documents) the work in the paper could not have been
> done. A regards this as having contributed to the initiation of the paper,
> and thus would prefer to be added as a co-corresponding author if
> appropriate.
>
> First, does the paper deserve a retraction, and second, does A deserve
> co-authorship?





[ccp4bb] covalent link to aa

2019-01-15 Thread Markus Heckmann
Dear all,
I have a structure in which a CYS is covalently bound to OCA (octanoic
acid). I started with JLigand and created a CYS-OCA covalent link; it
output a CIF file (verified, looks OK). I then added the LINK record to the
PDB file:

LINK SG  CYS A 161 C1  OCA A 902    CYS-OCA

I placed OCA into the model using Coot. When I then use Real Space
Refinement, the OCA is placed on top of the CYS itself. Why does Coot not
recognise the OCA-CYS link? For now, I just place OCA in a *reasonable*
position without any overlaps.


All files are located at
https://gist.github.com/ort163/e6b8411f45e5224f7d46fc36c3324e6f

Now in REFMAC I input the MTZ, the PDB and the CIF file (from JLigand).
After refinement, I find that the oxygen in the OCA points the wrong way.
The thioester bond is also NOT at the ideal distance (though the density is
good).

Is there anything wrong with my procedure? *Shouldn't* the refinement
optimise everything in the OCA-CYS link?
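
For what it's worth, a quick way to check whether the link restraint was
actually applied is to measure the SG-C1 distance in the refined output.
A minimal sketch (plain Python, no external libraries; "refined.pdb" is a
placeholder name, and the chain/residue numbers are taken from the LINK
record above, so adjust if yours differ):

import math

def get_atom(pdb_path, chain, resseq, resname, atom_name):
    """Return (x, y, z) of the first matching ATOM/HETATM record."""
    with open(pdb_path) as fh:
        for line in fh:
            if not line.startswith(("ATOM", "HETATM")):
                continue
            if (line[21] == chain and int(line[22:26]) == resseq
                    and line[17:20].strip() == resname
                    and line[12:16].strip() == atom_name):
                return tuple(float(line[30 + 8*i:38 + 8*i]) for i in range(3))
    raise ValueError(f"{atom_name} {resname} {chain}{resseq} not found")

sg = get_atom("refined.pdb", "A", 161, "CYS", "SG")
c1 = get_atom("refined.pdb", "A", 902, "OCA", "C1")
# Compare with the value_dist in the JLigand CIF (a C-S single bond is ~1.8 A);
# a much longer distance suggests the link was not picked up during refinement.
print(f"SG-C1 distance: {math.dist(sg, c1):.2f} A")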

Best
Markus





Re: [ccp4bb] buying a cluster

2018-12-02 Thread Markus Heckmann
Hi Graeme,

> I suspect that this conclusion depends very closely on (i) the shape of
> the problem and (ii) the extent to which the binary has been optimised for
> the given platform.

I do hope some of this information is analysed and either published or at
least put on the CCP4 wiki.

> I am pretty sure that there are some applications (heavily threaded, making
> extensive use of vector operations) which would be massively quicker on
> 2018 hardware than something a decade old. Certainly though, if you are
> comparing a not-highly-optimised single threaded binary then your
> conclusion is probably a valid one

I would really ask all the program developers (on the CCP4BB) to clearly
state, in a table on their websites, whether a given program is purely
GHz-dependent and not multi-threaded.
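
Something along these lines is all I mean - a tiny sketch (plain Python)
that normalises wall-clock time by clock speed, using the refmac benchmark
numbers James quotes further down this thread. GHz x seconds is only a very
crude proxy for single-core work, so treat it as illustrative:

# refmac benchmark times and clock speeds quoted in James Holton's email below.
benchmarks = [
    ("AMD Opteron, 1.4 GHz (2003)",    74.0, 1.4),
    ("Xeon E7-8870 v3, turbo 3.0 GHz", 68.6, 3.0),
    ("Xeon W-2155, turbo 4.5 GHz",     41.4, 4.5),
]

print(f"{'machine':34s}{'seconds':>10s}{'GHz*s':>10s}")
for name, seconds, ghz in benchmarks:
    # The two recent Xeons give similar GHz*s (run time roughly tracks clock
    # speed); the 2003 Opteron shows why cross-decade comparisons are shaky.
    print(f"{name:34s}{seconds:10.1f}{seconds * ghz:10.0f}")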




> Also how much power the machines take to get work done is a non-trivial
> factor…


But what about the environment? Trashing a decent machine from 2015 for the
latest Threadripper 2? These old machines have 80 Plus Gold power supplies.
Many (as with Apple's planned obsolescence) are *forcibly* destroyed rather
than refurbished at all.

Does DIALS run that much quicker? How much time is saved over a PhD student's
career if data processing speeds up from 15 min to 10 min? Sure, that is
perfect for use at the synchrotron, but otherwise?

Maybe the beamlines/synchrotrons should allow for remote data processing and
even refinement. And maybe all program developers should publish benchmarks -
that would help users greatly.

These days I have the feeling scientific software has copied the typical
Electron/web-framework programmers: programs/websites getting fatter rather
than more efficient, and hoping everyone has 128 GB of RAM.

Markus


> Cheerio Graeme
>
>
>
> > On 30 Nov 2018, at 19:32, James Holton <
> 270165b9f4cf-dmarc-requ...@jiscmail.ac.uk> wrote:
> >
> > I have a dissenting opinion about computers "moving on a bit".  At least
> when it comes to most crystallography software.
> >
> > Back in the late 20th century I defined some benchmarks for common
> crystallographic programs with the aim of deciding which hardware to buy.
> By about 2003 the champion of my refmac benchmark (
> https://bl831.als.lbl.gov/~jamesh/benchmarks/index.html#refmac) was the
> new (at the time) AMD "Opteron" at 1.4 GHz.  That ran in 74 seconds.
> >
> > Last year, I bought a rather expensive 4-socket Intel Xeon E7-8870 v3
> (turbos to 3.0 GHz), which is the current champion of my XDS benchmark.
> The same old refmac benchmark on this new machine, however, runs in 68.6
> seconds.  Only a smidge faster than that old Opteron (which I threw away
> years ago).
> >
> > The Xeon X5550 in consideration here takes 74.1 seconds to run this same
> refmac benchmark, so price/performance wise I'd say that's not such a bad
> deal.
> >
> > The fastest time I have for refmac to date is 41.4 seconds on a Xeon
> W-2155, but if you scale by GHz you can see this is mostly due to its fast
> clock speed (turbo to 4.5 GHz). With a few notable exceptions like XDS,
> HKL2k and shelx, which are multi-processing and optimized to take advantage
> of the latest processor features using intel compilers, most
> crystallographic software is either written in Python or compiled with
> gcc.  In both these cases you end up with performance pretty much scaling
> with GHz.  And GHz is heat.
> >
> > Admittedly, the correlation is not perfect, and software has changed a
> wee bit over the years, so comparisons across the decades are not exactly
> fair, but the lesson I have learned from all my benchmarking is that
> single-core raw performance has not changed much in the last ~10 years or
> so.  Almost all the speed increase we have seen has come from
> parallelization.
> >
> > And one should not be too quick to dismiss clusters in favor of a single
> box with a high core count. The latter can be held back by memory
> contention and other hard-to-diagnose problems.  Even with parallel
> execution many crystallography programs don't get any faster beyond using
> about 8-10 cores.  Don't let 100% utilization fool you!  Use a timer and
> you'll see.  I'm not really sure why that is, but it is the reason that
> same Xeon W-2155 that leads my refmac benchmark is also my champion system
> for running DIALS and phenix.refine.
> >
> > My two cents,
> >
> > -James Holton
> > MAD Scientist
> >
> >
> > On 11/26/2018 1:10 AM, V F wrote:
> >> Dear all,
> >> Thanks for all the off/list replies.
> >>
> >>> To be honest, how much are they paying you to take it? Can you sell it
> for
> >>> scrap?
> >> May be I will give it a pass.
> >>
> >>> To compare, two dual CPU servers with Skylake Gold 6148 - that is 40
> cores -
> >>> will probably beat the whole lot even if you could keep the cluster
> going.
> >>> And keeping clusters busy is a time consuming challenge... I know!
> >>> If they are 250W servers, then you are looking at £8000 per year to
> power
> >>> and cool it. The two modern servers will be more like £1500 per year
> to run.
> >>> And the servers will only cost 

Re: [ccp4bb] Long term storage for raw images/ crystallographic data sets

2018-12-02 Thread Markus Heckmann
Hi Raquel,
Are you using a compressed filesystem? I recently moved everything,
including the /home directory, to ZFS, which gave ~1.4x compression for old
ADSC images. I vaguely remember that, years ago, James suggested using
aufs/unionfs. You could even enable data deduplication to save space on
redundant images.
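
If you want a rough idea of what a compressed filesystem would buy you
before migrating, something like this quick sketch works (plain Python;
gzip rather than ZFS's lz4, and the directory path is a placeholder, so
treat the number only as a ballpark estimate):

import gzip
import os

def estimate_ratio(top, sample_bytes=8 * 1024**2):
    """Gzip the first few MB of every file under 'top'; return raw/compressed."""
    raw = packed = 0
    for dirpath, _, files in os.walk(top):
        for name in files:
            try:
                with open(os.path.join(dirpath, name), "rb") as fh:
                    chunk = fh.read(sample_bytes)
            except OSError:
                continue
            raw += len(chunk)
            packed += len(gzip.compress(chunk, compresslevel=1))
    return raw / packed if packed else float("nan")

print(f"approximate compression ratio: {estimate_ratio('/data/adsc_images'):.2f}x")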

In addition to James's suggestion of Amazon Glacier, I would recommend
Backblaze. I mainly use it for personal backup ($50/year), and storage is
unlimited. You can also get a drive FedExed to you for retrieval. But be
warned that the GUI sucks.

One could go for the business plan - clean command-line, API-based
upload/download - at roughly 350 per year.

If anyone is interested, Backblaze publishes fantastic hard-drive failure
statistics:

https://www.backblaze.com/blog/2018-hard-drive-failure-rates/

Markus


On Friday, November 30, 2018, James Holton <
270165b9f4cf-dmarc-requ...@jiscmail.ac.uk> wrote:

> The answer depends a lot on what you mean by "long-term storage".  Do you
> want the data to be available all the time on a mountable volume?  Or is
> putting it away on a shelf OK?  Do you want the storage to be as
> bulletproof and worry-free as possible?  Or are you OK with the fate of
> your data being somewhat nebulous, like in "the cloud"?  The price points
> for all these things are very different.
>
> You can now buy a single 8 TB drive for $230 USD.  LTO6 tapes are
> currently at ~7 USD/TB.  Both of these are the current lowest price/TB for
> disk and tape.  Using the media, of course, generally requires attaching it
> to a server that costs ~$5k-$10k USD.  Amazon Glacier is free for uploads
> and essentially free for downloading it back as long as you don't want more
> than 1 GB per month.  The other extreme is a NetApp, where you just want a
> turnkey system that keeps your data as safe as possible, but is also really
> fast.
>
> What do I do?  I am currently deploying a RAID6 array of 8 TB drives for
> high-performance storage.  For archiving I used to use DVD-R, but that
> can't keep up with a Pilatus, so now I'm on LTO6 tapes for off-line
> backups.  I know tapes are famous as "write-only media", but so far over
> the last 10 years I haven't had any real trouble reading back an old LTO
> tape.  
>
> -James Holton
> MAD Scientist
>
>
> On 11/29/2018 12:54 PM, Lieberman, Raquel L wrote:
>
> Dear All,
>
> How do your labs handle long-term raw data backups? My lab is maxing out
> our 6TB RAID backup (with two off-site mirrors) so I am investigating our
> next long term solution. The vast majority of the data sets are published
> structures (i.e. processed data deposited in PDB) or redundant/unusable so
> immediate access is not anticipated, but the size of data sets is
> increasing quickly with time, so I am looking for a scalable-yet-affordable
> solution.
>
> Would be grateful for input into various options, e.g. bigger HD/RAIDs,
> cloud backup, tape, anything else.
>
> I will compile.
>
> Thank you,
>
> Raquel
> --
> Raquel L. Lieberman, Ph.D.
> Professor
> School of Chemistry and Biochemistry
> Georgia Institute of Technology
>
>
>
>





[ccp4bb] Polder or FEM

2018-11-23 Thread Markus Heckmann
Dear Pavel,

By coincidence, today I was looking at a soak dataset and got totally
confused between the FEM map and the polder map.

I am working on a dataset that I processed at 2.7 A (in the summer, with
XDS), and there was weak density for OCA (octanoic acid) when I looked at
the FEM map. This week I reprocessed it with DIALS and got about 2.6 A (for
CC_half = 0.5). I ran FEM again, and there was *no* density at all for the
OCA. After seeing this CCP4 answer, I ran a polder map, and it shows clear
density for the OCA. Now I am really confused: why, and what should I infer?
The polder run also states "The polder map is likely to show the ligand."
CC(1,2): 0.5639
CC(1,3): 0.8722
CC(2,3): 0.6193


https://imagebin.ca/v/4NbuaSRUbc5t
FEM in violet and Polder in Green

Any guidance appreciated.
Markus





[ccp4bb] Assumptions on protein activity (again)

2018-10-12 Thread Markus Heckmann
Dear all
Thanks for all the responses. I would like to follow up on my question to
get some more advice.

I expressed a dimeric multi-domain protein (600 kDa). The protein was
purified (affinity chromatography) in 4 different common buffers and run in
the same SEC buffer. We observe a clear single size-exclusion peak
corresponding to the dimer. We then measured the activity of these 4
different protein preparations in a well-established assay.

What we observe: only one of the 4 has high activity, and one sample even
has NO activity at all. In all cases, HPLC-SEC-MALS on the protein sample
taken from the activity assay unequivocally confirms a dimeric state.

The observation of a dimeric peak without activity suggests that the
protein is in a state that is misfolded, either locally or partially. Are
there any *sensitive* methods to detect this behaviour, i.e. minor
structural changes?

Can circular dichroism detect these changes? Or are there other methods
suitable for a large multi-domain protein?

Many thanks,
Markus



Previous responses:
---
I would not say *active* in the case of an enzyme, but probably
*folded*. An enzyme may have many conformational states, some of which
may represent inactive states, which will not be distinguished with
gel filtration (because their hydrodynamic radii will be roughly the
same), unless the inactivation involved unfolding and aggregation of
the protein.

Is your enzyme pH sensitive? For example, if it has a histidine in the
active site and most of the buffer conditions you are testing are
below pH 6, you may be looking at a well folded protein that just
isn't active because you've protonated the active site residue. Or it
could be that the buffers you are testing are binding to your protein
and sterically interfering with your substrate? If you have just blocked
the binding site, that doesn't mean your protein isn't folded, or even that
it is truly inactive - it is merely inhibited. There could be all kinds of
reasons that changing buffers could change the activity of the protein
without unfolding the protein itself. Another example is that people
often use phosphate buffer in purification, but if the enzyme requires
a Mg, you could be inadvertently pulling that out of the enzyme by
using phosphate buffer (or using sulfate with an enzyme that requires
Ca, etc).
I'm sure it is possible that there are many enzymes in the PDB that
are clearly well folded (have good structures) that are not in their
fully active states due to the crystallisation conditions used to
obtain the crystals. We are usually capturing a single state of a
protein which usually has to be mobile to perform its enzymatic
function.
---
You can speak for yourself, but not for me. I do not assume activity
from a gel; that's what assays are for. Different buffers: it could be
that you have a cofactor, perhaps a metal. The best practice is to document
what you do in your publications to the extent that a reader could
duplicate your results.
---
There are lots of examples in the PDB of incorrect structures. And a
single peak on SEC doesn't guarantee correctly folded protein. What were
the differences between the buffers? pH, ionic strength and additives
all matter for enzyme activity, and many buffers do bind to active
sites and thus affect activity (despite the general attempt to use
large molecules which are unlikely to bind, in the case of the Good
buffers). All that being said, the idea of a single, correctly folded
conformation of an enzyme/protein is an oversimplification used in
textbooks rather than the more complicated picture held by experts in
the field.
--
I believe the strong assumption in the community is that a clear
single peak of appropriate Mw is a clear indication of pure protein,
worthy of intensive crystallization efforts. Whether it is active is
another question, and this should be measured. For your analysis, it is
not important in which buffers the protein is not active, but whether
the protein you purified is active in the buffer (maybe without
precipitant) you used for crystallization. A single apo structure is
usually not enough to determine the catalytic mechanism of an enzyme;
you usually need some substrate, transition-state and product (analog)
structures as well. If your protein is active in the crystallization
buffer and the ligand complexes make chemical sense, you can be pretty
sure that you have crystallized the right conformation. If your protein
is not active in the crystallization buffer, you must critically
analyze the structure, whether it makes chemical sense and whether you can
explain the absence of activity (e.g. pH far from optimum; inhibitor
bound in the active site). I am currently working on an enzyme whose
active site loves all kinds of substituted and unsubstituted
phosphates, sulfates etc., so it is not active in a wide range of
buffers like phosphate, MES, MOPS, HEPES etc. However, the crystal
structures still

[ccp4bb] Assumptions on protein purification

2018-10-11 Thread Markus Heckmann
Dear all,
Not directly a ccp4 question.
I am working on a multi-domain protein with multiple catalytic centres.
Purification by gel filtration (ÄKTA) in different buffers gives a clear,
distinct peak indicating pure protein. We could even crystallise it and
determine the 3D structure to about 2.5 A.

Why is there a strong assumption in the community (or at least in my
limited experience) that a clear single peak of the appropriate Mw is an
indication of *active and folded* protein, and that upon
crystallisation/structure determination it can describe the working of the
enzyme?

When I tested the activity of the protein in assays, I found that 3 out of
4 buffers give very poor product turnover. Can we discount the possibility
that some 3D structures in the PDB are of inactive (differently folded)
protein and hence may not represent the active state? Are there any best
practices? Or is this protein-dependent, with multi-domain proteins in
particular showing such weird behaviour?

Thanks for your comments,
Markus





[ccp4bb] size exclusion columns

2018-04-26 Thread Markus Heckmann
Dear all,

We are looking for a silica-based size-exclusion chromatography column for
protein purification upstream of a MALS detector. We have looked at Waters
(www.waters.com, BEH-450), Sepax (Unix-C 300) and Phenomenex (BioZen
SEC-3). Any column tips or recommendations when dealing with large (MDa)
proteins?

Many thanks
Markus


[ccp4bb] unit cell is double between 2 forms

2017-11-09 Thread Markus Heckmann
Dear all,
We crystallized a small protein that gives crystals in P2 with cell
53.16  65.73  72.89  90  110.94  90
(3 molecules in the asymmetric unit). Tested with POINTLESS; it does not
give any other possibility.

The other crystal form of the same protein, from similar conditions, is C2,
with cell 109.14  124.37  73.42  90  111.75  90. This has 6 molecules in
the a.s.u. Tested with POINTLESS; it does not give any other possibility.
The cell lengths a and b of the C2 form are twice those of the P2 form.

Is it usual to get such crystals from similar conditions or am I
missing something?
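
One quick sanity check is to compare the volume per molecule in the two
forms. A small sketch (plain Python; cell parameters and molecules per
a.s.u. as quoted above, with the standard 2 general positions for P2 and
4 for C2):

import math

def mono_volume(a, b, c, beta_deg):
    """Volume of a monoclinic cell: a*b*c*sin(beta)."""
    return a * b * c * math.sin(math.radians(beta_deg))

# P2: 2 a.s.u. per cell, 3 molecules per a.s.u.
per_mol_p2 = mono_volume(53.16, 65.73, 72.89, 110.94) / (2 * 3)
# C2: 4 a.s.u. per cell (2 general positions x C-centring), 6 molecules per a.s.u.
per_mol_c2 = mono_volume(109.14, 124.37, 73.42, 111.75) / (4 * 6)

print(f"P2: {per_mol_p2:.0f} A^3 per molecule")
print(f"C2: {per_mol_c2:.0f} A^3 per molecule")

Both come out near 39,000-40,000 A^3 per molecule, i.e. the doubled C2 cell
is at least consistent with essentially the same packing.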

Many thanks,
Mark


[ccp4bb] double cell dimensions between P2 and C2

2017-11-09 Thread Markus Heckmann
Dear all,
From a small protein we get crystals in P2 with cell
53.16  65.73  72.89  90  110.94  90
(3 molecules in the asymmetric unit). Tested with POINTLESS; it does not
give any other possibility.

Another crystal form of the same protein, from similar conditions, is C2,
with cell 109.14  124.37  73.42  90  111.75  90. This has 6 molecules in
the a.s.u. Tested with POINTLESS; it does not give any other possibility.
The cell lengths a and b of the C2 form are twice those of the P2 form.

Is it usual to get such crystals from similar conditions or am I
missing something?

Many thanks,
Mark


[ccp4bb] ccp4 website not secure

2017-10-11 Thread Markus Heckmann
Has anyone from the CCP4 website noticed this?
Your connection is not secure


The owner of www.ccp4.ac.uk has configured their web site improperly.
To protect your information from being stolen, Firefox has not
connected to this web site.


(https warning message)
https://www.ccp4.ac.uk/ccp4online/


[ccp4bb] to fix angle between ligand and residue atoms

2017-07-03 Thread Markus Heckmann
Dear ccp4-ers,
I have a dataset at 2.8 A. There is clear *continuous* density for a
ligand + residue. I used a LINK record to connect the S-gamma atom of the
Cys to the C1 atom of the ligand OCA. My biochemistry collaborator says
that CB-SG-C1-O1 should be approximately planar. However, during refinement
with REFMAC5 the connecting region loses planarity, although the ligand and
residue do stay properly inside the density. I assume this is due to the
low(er) resolution. Should I restrain the planarity, or should I leave it
to the refinement?
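
For reference, before deciding I wanted to see how far from planar the
torsion actually refines. A small sketch (plain Python; the four coordinate
triples are placeholders to be replaced with the CB, SG, C1 and O1 positions
from the refined PDB, and the sign convention may differ from other
programs, which does not matter for a 0/180 planarity check):

import math

def dihedral(p1, p2, p3, p4):
    """Torsion angle p1-p2-p3-p4 in degrees."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def cross(a, b):
        return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
    def dot(a, b):
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    b1, b2, b3 = sub(p2, p1), sub(p3, p2), sub(p4, p3)
    n1, n2 = cross(b1, b2), cross(b2, b3)
    y = dot(cross(n1, n2), b2) / math.sqrt(dot(b2, b2))
    return math.degrees(math.atan2(y, dot(n1, n2)))

# Placeholder coordinates: replace with the CB, SG (Cys) and C1, O1 (OCA)
# xyz values taken from the refined PDB file.
cb, sg = (1.0, 0.0, 0.0), (0.0, 0.0, 0.0)
c1, o1 = (0.0, 1.5, 0.0), (1.0, 2.0, 0.0)
print(f"CB-SG-C1-O1 torsion: {dihedral(cb, sg, c1, o1):.1f} deg (0 or 180 = planar)")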
Thank you all,
Markus


Re: [ccp4bb] Electronic Laboratory Notebook

2017-05-29 Thread Markus Heckmann
Hi Sebastiano,
I posted the same question recently (in April). I have just removed the
names and e-mail addresses for confidentiality (as some requested).

----- SUMMARY -----

I think biovia (aka accelrys) have a version, although I’ve never used
it, so I don’t know how good it is. Blair Johnston (Strathclyde)
recently mentioned SOMETHING about biovia ELNs in regards to a version
they are using with the continuous crystal growth people, but I don’t
know if it’s the same one (?)

Have a look here:
http://accelrys.com/products/unified-lab-management/biovia-electronic-lab-notebooks/





Lab archives: http://www.labarchives.com/ We like it a lot.




What sort of data / experiments do you want to store in the ELN?
What do you want to connect it to (if anything)?

Assuming that as you posted to CCP4, I assume that you are thinking of
protein production experiments or crystallography of some form?

We are currently using Biovia Notebook (available on premise (which is
what we have) or hosted I believe) for chemistry, biology and
structural biology (protein production / crystallography)
We are moving to Dotmatics (www.dotmatics.com) for various reasons
(available hosted or on premise).  They have a nice protein production
module but nothing specific for crystallography yet, however their ELN
is dynamic enough that something simple could be configured.

Happy to discuss (off the CCP4 board) if you want more information.






Hi,
this is what we've been using for 10 years or so, and we are happy with it:
http://accelrys.com/micro/notebook-cloud/
Academic licence is appr. 100 USD/user/year.







Markus,

We use labguru, with only a modest amount of enthusiasm on the
part of my lab.  You can get it hosted on your own servers, and then
it will meet EU data protection standards.  We think it is sort of OK,
but not much better than sort of




Dear Markus,


I have been using Labguru with my lab for the past 18 months or so. I
have found that it is a good solution for many issues. It has its own
foibles that are annoying at times, but no more than anything else!
The key advantages are that it keeps electronic data well paired with
the experimental details, and under the control of the PI; and that
protocols and data are really easy to share between team members; the
disadvantage is that team members need at least a little time to get
used to working with an electronic system, and many people seem very
wedded to paper.

When I was looking for this solution, it seemed that most of the
quality suppliers were proprietary. I suspect that the cost for
proprietary is probably not much more than the personnel costs of
trying to get a free solution to work if you have to host it (some
parts of my university are also pretty unhappy about me using an
externally hosted solution). The conversations that I had in the UK
(admittedly, a couple of years ago) suggested that the suppliers would
also want a significant up-front cost to help set up the hosted
solution - so it made sense to use externally hosted unless a whole
department pretty much was going to move over to ELNs.

As a PI, I would not want to move back to paper notebooks now if I
could avoid it.


Hope this helps,




Dear Markus,

For the last years I have been using LabArchives, since I got an
account for "free" with my Graphpad Prism license. I've been mostly
happy with it. It can automatically upload attachments from a watched
folder, and integrates with some useful software such as Prism and
chemdoodle.
Labfolder also appears to be making progress. Both these have
android-apps that in my opinion increase the likelihood of them
actually being used.
I end up mostly using the notebook for plain text, whereas I would
probably do more free-hand sketches and tables if I had a paper
logbook.



Hi Markus,

Only an option for Mac users but I use Findings which is cheap, stored
locally and can be made to do most of what I need in a lab notebook.








We have this ELN, eLabFTW, developed by an engineer at the Curie Institute,
which is free and quite nice:
https://www.elabftw.net/












On Fri, May 19, 2017 at 11:01 AM, Sebastiano Pasqualato
 wrote:
>
> Dear all,
> another enquiry for the great community!
>
> We are considering the idea of moving to electronic laboratory notebooks
> rather than paper ones.
>
> Are you happily using one and would warmly suggest its implementation?
> Our department does not only deal with biochemical experiments, but performs
> a lot of genomics (big data analysis) and mouse genetics experiments, so
> experience of Notebooks used in departments that also have those activities
> would be a plus.
> We will consider free and paid softwares, 

[ccp4bb] add ligand solution onto drop directly SUMMARY

2017-01-26 Thread Markus Heckmann
Thanks for the responses. Yes, my crystal did survive and I can see
density for my ligand.

Summary:
1. If the crystal survives, it's fine. These are just different ways
of exposing the crystal to the ligand, so do what works. The only
issue will be if you don't see electron density for the ligand in your
structure. If you just add ligand to the drop then there is plenty of
other precipitate/stuff that could non-specifically bind to the ligand
and reduce the amount available to bind the protein in the crystal. So
if you don’t see density, it would still be worth soaking a “clean”
crystal in the ligand/precipitant solution.

2.  It is OK as long as your crystal survives. I do this regularly, or
even just add dry compound to the drops directly, usually after adding
more reservoir to make the drop a bit bigger. It seems to work fine
for some compounds, not for others. It is very empirical.

3. The truth is in the map. Ergo, zap it and rationalize why it worked
or not later.


[ccp4bb] add ligand solution onto drop directly

2017-01-24 Thread Markus Heckmann
Dear all,
I wondered whether it is OK to pipette the ligand solution (X-CoA) *directly*
onto the drop containing the crystal (1:1 ratio of protein:precipitant, 2 µl)
instead of dissolving it in precipitant solution and transferring the crystal
into this ligand-containing precipitant solution. The crystals survive this,
as I add the ligand solution to the edge of the drop and gently mix the two
solutions. Since I have already collected my datasets, I wonder whether this
is OK.

Many thanks,
Markus