Re: [ccp4bb] Rescale merged data?

2024-04-17 Thread Matt Mcleod
Thanks Doeke (and others who have replied directly).  I suppose my big
concern with simply truncating the data after merging is the bin size and
the data-processing statistics.  If the data out to the detector edge/corner
were scaled in 10 bins and I end up removing 5 of those bins by truncating
the resolution post-merging, the merged statistics reported in Table 1 will
still be based on scaling of the entire detector face, rather than on the
truncated dataset reprocessed into 10 bins.  I thought phenix.table_one
might fix this by taking the highest-resolution 10% of reflections and
recalculating the merging statistics, but it only does so if you supply the
unmerged data (which is missing).  I guess the transparent thing to do, if I
do not reprocess the data, is to report the number of bins used and manually
curate the high-resolution statistics in Table 1 from the last bin (at the
resolution limit used in refinement, or applied with CAD) in the scale.log
file from HKL2000.

Just want to say thanks again to this community; it's great to get
insight into these problems so easily and quickly.
Matt

On Wed, 17 Apr 2024 at 16:35, Hekstra, Doeke Romke <
doeke_heks...@harvard.edu> wrote:

> Hi Matt,
>
>
>
> I appreciate disagreement and comments from colleagues. My two cents are
> that it seems unnecessary to repeat scaling and merging, or any earlier
> step. If you want to remove structure factor amplitudes or merged
> intensities from the MTZ file you can do so using MTZUTILS or similar
> functionality in CCP4 (
> https://www.ccp4.ac.uk/html/mtzutils.html#generalresolution). For
> refinement, you can specify the desired resolution range in your favorite
> refinement program.
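>
> A minimal sketch of what I mean (file names and the 2.0 A limit are
> placeholders; see the RESOLUTION keyword on the page linked above):
>
>   # Sketch only: merged.mtz / merged_cut.mtz and 2.0 A are placeholders.
>   mtzutils HKLIN1 merged.mtz HKLOUT merged_cut.mtz << eof
>   RESOLUTION 100.0 2.0
>   END
>   eof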
>
>
>
> My personal convention is to use CC1/2 = 0.30 as the point to which to
> retain data and <I/sigma(I)> = 2 as the nominal resolution of the dataset.
> If you have the HKL2000 scaling log, you should be able to retrieve this
> information. I frankly wish we'd just deposit all data in the PDB rather
> than truncate based on one criterion or another.
>
>
>
> Best, Doeke
>
>
>
> *From:* Matt Mcleod 
> *Sent:* Wednesday, April 17, 2024 4:12 PM
> *To:* Hekstra, Doeke Romke 
> *Cc:* CCP4BB@JISCMAIL.AC.UK
> *Subject:* Re: [ccp4bb] Rescale merged data?
>
>
>
> Sure thing.
>
>
>
> A former student left somewhere between 30-50 datasets, but they scaled the
> data to the detector corners (or maybe edges) in HKL2000.  Many of the
> high-resolution bins have no reflections in them.  He then merged the data,
> presumably in HKL2000 again, and did his model building/refinement.  We now
> need to re-refine the models against these data for publication, but with a
> more suitable resolution cutoff.
>
>
>
> Rather than go back, index/integrate all the data, and then rescale it to a
> more appropriate limit (then merge), I was wondering if there is a way to
> take the merged reflections, as either .sca or .mtz (from scalepacktomtz
> output), and cut them at a more appropriate resolution.  It doesn't seem
> like the student left unmerged data.
>
>
>
> So, nothing fancy (anisotropy, etc.); there is just a lot of data that needs
> to be adjusted, and I am trying to avoid reprocessing all the frames again.
>
>
>
> Matt
>
>
>
> On Wed, 17 Apr 2024 at 15:59, Hekstra, Doeke Romke <
> doeke_heks...@harvard.edu> wrote:
>
> Hi Matt,
>
> It would be helpful if you could describe your case in more detail. Do you
> want to change the resolution cutoff after scaling? Do you want to keep
> more data? Fewer? Or do you mean something different such as truncation to
> generate amplitudes, application of anisotropic resolution cutoffs,  or
> outlier rejection? Are you referring to data that were scaled in HKL2000?
>
> Best, Doeke
>
> -----Original Message-----
> From: CCP4 bulletin board  On Behalf Of Matt McLeod
> Sent: Wednesday, April 17, 2024 3:04 PM
> To: CCP4BB@JISCMAIL.AC.UK
> Subject: [ccp4bb] Rescale merged data?
>
> Hi all,
>
> I am looking at an old student's data, and it looks like they didn't
> properly cut off the data during scaling.  All of the files I have appear to
> be the merged .sca (or mtz after converting with scalepacktomtz) - is there
> a way to re-truncate the data after merging, or do I have to reprocess the
> data?
>
> Thanks,
>
> 
>

Re: [ccp4bb] Rescale merged data?

2024-04-17 Thread Matt Mcleod
Sure thing.

A former student left somewhere between 30-50 datasets, but they scaled the
data to the detector corners (or maybe edges) in HKL2000.  Many of the
high-resolution bins have no reflections in them.  He then merged the data,
presumably in HKL2000 again, and did his model building/refinement.  We now
need to re-refine the models against these data for publication, but with a
more suitable resolution cutoff.

Rather than go back, index/integrate all the data, and then rescale it to a
more appropriate limit (then merge), I was wondering if there is a way to
take the merged reflections, as either .sca or .mtz (from scalepacktomtz
output), and cut them at a more appropriate resolution.  It doesn't seem like
the student left unmerged data.

So, nothing fancy (anisotropy, etc.); there is just a lot of data that needs
to be adjusted, and I am trying to avoid reprocessing all the frames again.

Matt

On Wed, 17 Apr 2024 at 15:59, Hekstra, Doeke Romke <
doeke_heks...@harvard.edu> wrote:

> Hi Matt,
>
> It would be helpful if you could describe your case in more detail. Do you
> want to change the resolution cutoff after scaling? Do you want to keep
> more data? Fewer? Or do you mean something different such as truncation to
> generate amplitudes, application of anisotropic resolution cutoffs,  or
> outlier rejection? Are you referring to data that were scaled in HKL2000?
>
> Best, Doeke
>
> -----Original Message-----
> From: CCP4 bulletin board  On Behalf Of Matt McLeod
> Sent: Wednesday, April 17, 2024 3:04 PM
> To: CCP4BB@JISCMAIL.AC.UK
> Subject: [ccp4bb] Rescale merged data?
>
> Hi all,
>
> I am looking at an old student's data, and it looks like they didn't
> properly cut off the data during scaling.  All of the files I have appear to
> be the merged .sca (or mtz after converting with scalepacktomtz) - is there
> a way to re-truncate the data after merging, or do I have to reprocess the
> data?
>
> Thanks,
>
> 
>


-- 
*Matthew Jordan McLeod, PhD*
*Post-Doctoral Fellow - Cornell University*





[ccp4bb] Rescale merged data?

2024-04-17 Thread Matt McLeod
Hi all,

I am looking at an old student's data, and it looks like they didn't properly
cut off the data during scaling.  All of the files I have appear to be the
merged .sca (or mtz after converting with scalepacktomtz) - is there a way to
re-truncate the data after merging, or do I have to reprocess the data?

Thanks,





[ccp4bb] Viewing samples under liquid nitrogen

2023-12-15 Thread Matt McLeod
Hi all,

Has anyone heard of, or have ideas on, how to view a sample after it has been
plunged into liquid nitrogen, while it is still under LN2?
Thanks!





[ccp4bb] Occupancy refinement protocol and best practices

2023-11-22 Thread Matt McLeod
Hi all,

I'm wondering about occupancy refinement - both what is going on under the
hood and what the best practices are.

In the example I have, there is a ligand found in two distinct, partially
overlapping sites that can be modeled with some confidence, but there are
likely additional very-low-occupancy poses that blur the electron density.
The modeled poses are known from prior work, so even though there is smearing
we know the ligand is in the modeled conformation.  After perturbing the
crystal, I am trying to decide what the best approach is for getting some
sort of numerical occupancy value to describe the distribution.

1.  In Phenix, how is the occupancy number determined?  Is there a real-space
correlation between the experimental density and the model (weighted by
occupancy) that is optimized?  How can this go wrong?  I fear that the
smearing and heterogeneous nature will throw the refinement off (over- or
under-fitting to peripheral density rather than to the hyper-localized
position of the model).

2.  Are there errors associated with the occupancy numbers?

3.  For my own testing, I refined at 5% occupancy increments, manually
inspected the Fo-Fc and 2Fo-Fc maps, and selected the value that gave the
smallest positive and negative Fo-Fc peaks.  This is how we submitted the
work to the journal, but a reviewer wants the value to be calculated
automatically.  Is my approach problematic beyond the subjective nature of my
judging when the peaks are minimized?  Is it less reliable than automated
fitting?  Will this really make a difference?  And honestly, how should I let
Phenix estimate these values (see the sketch below)?  Set the B-factor to the
average protein B?  Refine both simultaneously?  Or something else?
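
As a concrete starting point, this is the sort of phenix.refine run I have in
mind - just a sketch, with file names as placeholders, and assuming the two
ligand copies are modeled as alternate locations A and B (my understanding is
that phenix.refine then constrains their occupancies automatically):

  # Sketch only: model.pdb / data.mtz are placeholders.
  # Including "occupancies" in the strategy lets altloc occupancies refine.
  phenix.refine model.pdb data.mtz \
      strategy=individual_sites+individual_adp+occupancies \
      main.number_of_macro_cycles=5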

Any insight would be appreciated.
Matt





Re: [ccp4bb] Occupancy Refinement limitation

2023-09-06 Thread Matt Mcleod
Hi,

I should be a bit more specific.

We have many crystal structures indicating that a loop adopts conformation A
or conformation B, or that the loop is disordered, in which case the electron
density is washed out.  These states depend on how we perturb the system.

We have a series of datasets, as a function of osmolyte concentration, that
shifts this equilibrium, and we want to put a numerical value on it.
Resolution is between 1.7 and 2.0 A.  These are non-anomalous data, but it
would certainly be helpful in the future to include some anomalous
scatterers if possible.  We have been using phenix.refine.

So what we have done is model both conformations A and B and refine against
the data to get the occupancies (with fixed B-factors), regardless of whether
there is electron density present for one of the two conformations, since in
some cases (likely all) there will be a population of A, B, and disordered,
and the true relative occupancies will shift.  I am trying to sort out how
accurate these occupancy values are, as opposed to simply showing the
electron density for each conformation contoured at some common threshold.
My general sense is that if a conformation is modeled but has no electron
density, it will still refine to a non-zero occupancy, and conversely, if it
is the only conformation present, it will refine to less than unity.
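
For what it's worth, this is roughly how I have been setting up the two loop
conformations as a constrained occupancy group in phenix.refine - a sketch
only, with the residue range and file names as placeholders, and possibly
redundant with what phenix.refine already does automatically for altloc
pairs (my reading of the documentation is that the selections within one
constrained_group are forced to sum to unit occupancy):

  # Sketch only: chain/residue range and file names are placeholders.
  cat > occ.eff << eof
  refinement.refine.occupancies {
    constrained_group {
      selection = chain A and resseq 120:132 and altid A
      selection = chain A and resseq 120:132 and altid B
    }
  }
  eof
  phenix.refine model.pdb data.mtz occ.eff \
      strategy=individual_sites+individual_adp+occupancies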

I will take a look at the references!  Very much appreciated.
Matt

On Wed, 6 Sept 2023 at 02:08, Eleanor Dodson <
176a9d5ebad7-dmarc-requ...@jiscmail.ac.uk> wrote:

> Well - occupancy refinement is particularly imprecise, and highly
> correlated with temperature factors.
> Also the population for a surface ARG or LYS may well have more than two
> conformations, whereas some internal residue is better defined.
> There is also the occupancy (Q) of solvent - dual occupancies will generate
> dual solvent networks.
>
> You don't say what resolution your data are. Again, at 0.8 A you can be
> confident - at 3 A it is at best a guess. So I for one do not take the
> numbers very seriously - they are a flag only.
>
> Presumably you didn't model the second conformation unless there was some
> feature in an earlier map to suggest it existed?
> Good luck Eleanor
>
>
> On Wed, 6 Sept 2023 at 03:18, Pavel Afonine  wrote:
>
>> Hi Matt,
>> I believe figure 3 here:
>> https://www.nature.com/articles/s41467-018-06957-w
>> is relevant to your question.
>> Pavel
>>
>>
>> On Tue, Sep 5, 2023 at 11:32 AM Matt McLeod  wrote:
>>
>>> Hi all,
>>>
>>> I am trying to get some insight into the accuracy/precision of occupancy
>>> refinements.  I have done some 2-state occupancy refinements and have
>>> observed the refinement reaching ~0.25-0.3 occupancy for the minor
>>> population.  This population, when I inspect the electron density maps, has
>>> essentially no evidence for being present.  I was wondering:
>>>
>>> What are the errors in the reported occupancies?
>>>
>>> Is there a lower and upper limit to occupancy refinements?  As in, if
>>> you occupancy refine two states and one is imaginary will it refine to
>>> approximately 1 and 0?  Or does the background noise always give a
>>> positive number to the imaginary set?  This would, to me at least, be the
>>> lower and upper limits to the occupancy refinements and could be used as a
>>> normalization factor for other atoms.  Maybe my logic is off...
>>>
>>> Any insight or literature would be appreciated!
>>> Matt
>>>


-- 
*Matthew Jordan McLeod, PhD*
*Post-Doctoral Fellow - Cornell University*





[ccp4bb] Occupancy Refinement limitation

2023-09-05 Thread Matt McLeod
Hi all,

I am trying to get some insight into the accuracy/precision of occupancy
refinements.  I have done some 2-state occupancy refinements and have
observed the refinement reaching ~0.25-0.3 occupancy for the minor
population.  This population, when I inspect the electron density maps, has
essentially no evidence for being present.  I was wondering:

What are the errors in the reported occupancies?

Is there a lower and upper limit to occupancy refinements?  As in, if you
occupancy refine two states and one is imaginary, will it refine to
approximately 1 and 0?  Or does the background noise always give a positive
number to the imaginary set?  This would, to me at least, be the lower and
upper limits to the occupancy refinements and could be used as a
normalization factor for other atoms.  Maybe my logic is off...

Any insight or literature would be appreciated!
Matt





[ccp4bb] Determining Second Lattice

2023-08-30 Thread Matt McLeod
Hi all,

I have a lot of large datasets that I want to screen to determine whether
there are one or two lattices in the diffraction.  I was wondering if there
is a simple and quick way to do so.

Currently, I am processing with DIALS and going as far as indexing, where the
percentage of spots indexed indicates whether there is potentially a second
lattice - and then visually inspecting the images when there is a significant
number of rejected spots.

I have autoprocessing log files (e.g. aimless.log, autoindex.log,
fast_dp.log) that were generated at the beamline, but I cannot see a similar
metric in them suggesting a second lattice.
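
For reference, this is roughly the DIALS route I am using now - a sketch,
with the frame path as a placeholder; max_lattices is the option I use to
let the indexer look for a second lattice:

  # Sketch only: the frame path is a placeholder.
  dials.import /path/to/dataset_01/*.cbf
  dials.find_spots imported.expt
  dials.index imported.expt strong.refl max_lattices=2
  # then check the "% indexed" / unindexed-spot counts in dials.index.log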

Any insight would be appreciated!
Matt





[ccp4bb] Renumber residues in a working PDB file - applying a sequence's numbering to a PDB file

2023-02-04 Thread Matt McLeod
Hi all,

I have been refining a structure, and somehow along the way the residue
numbering has completely shifted.  For instance, the first section of
residues is shifted by, say, 8 numbers, and then there is a gap where the
residue numbers jump from 121 to 151, 152, 153... and so on.  It's quite a
mess.

Is there a way to take a sequence file with the correct residue numbers and
apply them to the PDB file?  I have tried Renumber Residues in Coot, but this
isn't working; it just shifts some of them and applies opposite shifts
elsewhere because the numbering is so discontinuous.  Align and Mutate just
shifts them incorrectly relative to the input sequence without applying the
sequence file's numbering to the PDB.

Any suggestions would be appreciated before I go and do this all manually...
Matt





Re: [ccp4bb] PAIREF - Warning - not enough free reflections in resolution bin

2022-10-01 Thread Matt McLeod
Hey everyone,

Thanks for all the suggestions - there are a few different things I can try
now.  The data are very anisotropic (STARANISO might help) in terms of how
the crystal diffracts, and I think changing the bin size will help
specifically with PAIREF (it's only a warning, so the run does complete).  I
collected the data in oil at room temperature using a vector scan, so there
are also differences in data quality through the collection (not too severe
based on the data processing), radiation damage, a changing background from
the oil, etc.

However, diagnosing the problem further, it seems that merging with AIMLESS
throws out a lot of my high-resolution reflections... like a lot.  This
explains why truncating the data doesn't change the maps, and why my Table 1
statistics for the high-resolution bin are dismal.  I can supply log files
when I find them.  Now I have to determine whether the outlier rejections are
justified, and why the DIALS processing didn't flag these reflections as
outliers.

I have yet to look into the AIMLESS outlier-rejection protocol, but my guess
is that the high-resolution reflections are real; there are just not many of
them, and they are not very redundant, so they are being tossed.
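
If it helps anyone following along, the comparison I am planning is simply to
re-run the merging step with looser outlier rejection and see which
high-resolution reflections come back - a sketch only, with file names as
placeholders; the REJECT/RESOLUTION keyword syntax below is my reading of the
AIMLESS documentation, so please check it against the current docs:

  # Sketch only: file names are placeholders, and the keyword arguments
  # should be verified against the AIMLESS documentation before use.
  aimless HKLIN unmerged_from_dials.mtz HKLOUT remerged.mtz << eof
  RESOLUTION LOW 118.6 HIGH 2.2
  REJECT 10
  END
  eof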

Matt





[ccp4bb] PAIREF - Warning - not enough free reflections in resolution bin

2022-09-30 Thread Matt McLeod
Hi all,

I am having a bit of trouble with PAIREF.  I am trying to determine the
resolution cutoff for a dataset that has high Rwork (0.223) / Rfree (0.275)
values for its resolution (2.24 A).  I have truncated the data further, to
2.4 A, and the R values get better, but the electron density maps do not
improve whatsoever.  When I put the data into PAIREF I get this message:

WARNING: There are only 40 < 50 free reflections in the resolution shell 
2.75-2.70 A. Values of statistics Rfree and CCfree in this shell could be 
misleading. Consider setting thicker resolution shells.

This warning occurs for all resolution bins.  Looking at the PDB log of the
refined structure (where the data look quite reasonable - maybe not a 2.3 A
dataset, but not much worse):

REMARK   3  FIT TO DATA USED IN REFINEMENT (IN BINS).
REMARK   3   BIN  RESOLUTION RANGE  COMPL.  NWORK  NFREE   RWORK   RFREE  CCWORK  CCFREE
REMARK   3     1   60.08 - 4.28      1.00    4135    213  0.1646  0.2213   0.940   0.901
REMARK   3     2    4.28 - 3.40      1.00    4095    201  0.1951  0.2672   0.928   0.857
REMARK   3     3    3.40 - 2.97      1.00    4099    204  0.2736  0.3303   0.861   0.739
REMARK   3     4    2.97 - 2.70      1.00    4026    212  0.3486  0.3508   0.726   0.551
REMARK   3     5    2.70 - 2.51      0.78    3178    154  0.4027  0.4062   0.637   0.594
REMARK   3     6    2.50 - 2.36      0.29    1154     60  0.4118  0.3984   0.647   0.547
REMARK   3     7    2.36 - 2.24      0.01      29      1  0.3366  0.4089   0.343   0.000

I assume NFREE is the issue, but the table doesn't suggest that there aren't
enough reflections overall.  There needs to be a stricter cutoff, as the
completeness goes to pot, but the scaling log from DIALS suggested it was at
least a good starting point.

 Statistics by resolution bin:
 d_max  d_min    #obs  #uniq  mult.  %comp    <I>  <I/sI>  r_mrg  r_meas  r_pim   cc1/2  cc_ano
118.81   5.97   10436   1577   6.62  99.94  797.6    22.8  0.110   0.120  0.046  0.992*  -0.397
  5.97   4.74    9602   1549   6.20 100.00  491.7    10.0  0.172   0.188  0.075  0.979*  -0.240
  4.74   4.14   10410   1528   6.81 100.00  513.5    10.3  0.191   0.207  0.079  0.978*  -0.281
  4.14   3.76   10833   1551   6.98 100.00  324.7     6.9  0.238   0.257  0.098  0.969*  -0.419
  3.76   3.49   10956   1526   7.18 100.00  207.5     4.6  0.259   0.280  0.105  0.975*  -0.271
  3.49   3.29   10899   1546   7.05 100.00  133.4     3.1  0.272   0.294  0.111  0.966*  -0.258
  3.29   3.12    9565   1528   6.26  99.93   86.9     2.0  0.314   0.343  0.136  0.935*  -0.346
  3.12   2.99    9256   1519   6.09 100.00   57.1     1.3  0.354   0.387  0.154  0.932*  -0.354
  2.99   2.87    9317   1524   6.11 100.00   43.6     1.0  0.403   0.441  0.177  0.934*  -0.198
  2.87   2.77   10087   1549   6.51 100.00   30.9     0.7  0.455   0.494  0.192  0.923*  -0.116
  2.77   2.69   10157   1524   6.66 100.00   26.4     0.6  0.503   0.546  0.210  0.927*  -0.132
  2.69   2.61   10351   1541   6.72 100.00   19.3     0.4  0.613   0.665  0.255  0.925*  -0.155
  2.61   2.54   10121   1504   6.73 100.00   15.5     0.4  0.735   0.797  0.306  0.906*  -0.088
  2.54   2.48   10510   1539   6.83 100.00   12.4     0.3  0.932   1.009  0.385  0.857*   0.006
  2.48   2.42   10432   1519   6.87 100.00   11.2     0.2  0.972   1.052  0.399  0.829*  -0.111
  2.42   2.37   10596   1532   6.92 100.00   10.2     0.2  1.095   1.184  0.447  0.844*  -0.083
  2.37   2.32   10577   1535   6.89 100.00    8.9     0.2  1.259   1.361  0.514  0.825*  -0.082
  2.32   2.28   10217   1513   6.75 100.00    7.8     0.2  1.421   1.540  0.587  0.725*  -0.056
  2.28   2.24    9702   1521   6.38 100.00    6.4     0.1  1.683   1.834  0.721  0.600*  -0.001
  2.24   2.20    9551   1538   6.21 100.00    6.1     0.1  1.895   2.071  0.826  0.603*  -0.041
118.64   2.20  203575  30663   6.64  99.99  142.0     3.3  0.222   0.241  0.093  0.989*  -0.265



Does anyone have any insight into where PAIREF is getting hung up?  I am open
to any other suggestions for handling this dataset to determine a resolution
cutoff, and I can supply any other log files that may be useful.
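
For completeness, this is the sort of command I have been running - a sketch,
with file names as placeholders; the option letters (-i for the initial
high-resolution limit, -n for the number of added shells, -s for the shell
thickness) are my reading of the PAIREF documentation, so please check them
against pairef --help before relying on them:

  # Sketch only: file names are placeholders, and the option names should be
  # verified against the PAIREF documentation / pairef --help.
  pairef --XYZIN refined_2.4A.pdb --HKLIN data_2.2A.mtz -i 2.4 -n 2 -s 0.1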

Matt





Re: [ccp4bb] Lower b-factors with increasing T

2022-09-08 Thread Matt McLeod
Hi all,

This is my first JISC post, so I am still working out how to navigate it.  I
have to say I feel like there is a much better way to run a forum on
structural biology... I could imagine a Discord server where we can post
individual questions, have live chats, etc.  I'd be happy to set this up if
people are interested.

The data collection was done on a single crystal, using vector scanning to 
minimize radiation damage, and using oil to prevent dehydration.

I have since gone back and re-processed the 313 K dataset with an updated
DIALS, and the problem seems to have gone away (it also gave me better
scaling statistics)... my guess is that the newer scaling program simply
handles some of these "bugs" better.  This high-temperature dataset now has
the highest average B-factor.

Regardless, there does not seem to be much variation across the modest
temperatures, which may be in line with the biological story, but these
values make much more sense now.  Thanks everyone for all the suggestions
with regard to scaling the data together, SCALEIT, etc.  I have a few
different options for interpreting the data now.
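
For the archive, the cross-scaling route I plan to try looks roughly like
this - a sketch only, with column labels and file names as placeholders (the
datasets would first need to be combined into one MTZ, e.g. with CAD, and the
LABIN/REFINE keywords should be checked against the SCALEIT documentation):

  # Sketch only: the combined MTZ and its column labels are placeholders.
  # F_293/SIGF_293 is treated as the reference, F_313/SIGF_313 as the
  # "derivative" to be scaled onto it.
  scaleit HKLIN combined.mtz HKLOUT scaled.mtz << eof
  LABIN FP=F_293 SIGFP=SIGF_293 FPH1=F_313 SIGFPH1=SIGF_313
  REFINE ANISOTROPIC
  END
  eof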

Matt





Re: [ccp4bb] Lower b-factors with increasing T

2022-09-07 Thread Matt McLeod
In addition, I computed the Wilson B-factors (A^2):

253 K - 41
273 K - 35.4
293 K - 36.5
313 K - 0.19

Looks like there is definitely an issue with the data scaling.  Still looking 
for suggestions as to what to tweak.

Matt





[ccp4bb] Lower b-factors with increasing T

2022-09-07 Thread Matt McLeod
Hi everyone,

I have a series of datasets at 253 K (~2.0 A), 273 K (2.0 A), 293 K (2.0 A),
and 313 K (2.2 A), and I am curious about the details of how B-factors are
determined.

I have treated these datasets more-or-less identically for comparison's sake.  
I used DIALS to index, integrate, and scale the data.  I scaled the data to a 
~0.6 CC1/2 cutoff.  

After fully refining the datasets, there is an odd trend with respect to
temperature (compared with what has been previously published), and I assume
that this comes from "behind-the-scenes" computation rather than from a
biophysical observation.  The B-factors decrease slightly from 253 to 293 K
and then drop significantly at 313 K.  The maps look pretty well identical
across the datasets.

253 K - 53.8 A^2
273 K - 48.4 A^2
293 K - 45.5 A^2
313 K - 18.6 A^2

I compared the Wilson intensity plots from DIALS scaling for 273 K and 313 K,
and they are very comparable.

I am looking for suggestions on where to look at how these B-factors are
determined, or on how to validate whether the B-factors are accurate.  Also,
any relevant literature would be welcome.  From what I have read, the general
trend is that as T increases the atoms have more thermal energy, which raises
the B-factors, and this trend holds when comparing datasets collected at
different temperatures.
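
One check I can think of is to compare each model's refined average B with
the Wilson B estimated directly from the data, for example with xtriage (the
file name is a placeholder):

  # Sketch only: the file name is a placeholder.  phenix.xtriage reports a
  # Wilson B estimate among its other diagnostics.
  phenix.xtriage data_313K.mtz

If the Wilson B and the refined mean B disagree wildly for only one
temperature, that would point at the data processing rather than at a real
biophysical trend.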

Thank you and happy to supply more information if that is helpful,
Matt


