[gmx-users] Re: choosing force field

2013-11-10 Thread pratibha
On 11/9/13 12:48 AM, pratibha wrote: Sorry for the previous mistake. Instead of 53a7, the force field I used was 53a6. 53A6 is known to under-stabilize helices, so if a helix did not appear in a simulation using this force field, it is not definitive proof

[gmx-users] Re: choosing force field

2013-11-08 Thread pratibha
Sorry for the previous mistake. Instead of 53a7, the force field I used was 53a6. On Fri, Nov 8, 2013 at 12:10 AM, Justin Lemkul [via GROMACS] ml-node+s5086n5012325...@n6.nabble.com wrote: On 11/7/13 12:14 PM, pratibha wrote: My protein contains metal ions which are parameterized

[gmx-users] Re: choosing force field

2013-11-07 Thread pratibha
My protein contains metal ions which are parameterized only in the GROMOS force field. Since I am a newbie to MD simulations, it would be difficult for me to parameterize them myself. Can you please guide me, as per my previous mail, as to which of the two simulations I should consider more

[gmx-users] choosing force field

2013-11-04 Thread pratibha kapoor
Dear all, I would like to carry out unfolding simulations of my dimeric protein and would like to know which of GROMOS96 43 or 53 is the better force field to work with. Also, is the GROMOS96 43a1 force field redundant? When I searched the previous archive, I could see a similar question was
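A minimal sketch of how a GROMOS96 force field can be selected explicitly at the pdb2gmx stage (the input and output file names here are placeholders, not from the original thread):

    # choose the force field on the command line instead of interactively
    pdb2gmx -f protein.pdb -ff gromos53a6 -water spc -o processed.gro -p topol.top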

[gmx-users] parallelization

2013-10-17 Thread pratibha kapoor
Dear GROMACS users, I would like to run my simulations on all nodes (8) with full utilisation of all cores (2 each). I have compiled GROMACS version 4.6.3 using both thread-MPI and OpenMPI. I am using the following command: mpirun -np 8 mdrun_mpi -v -s -nt 2 -s *.tpr -c *.gro But I am getting the following
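A minimal sketch of how the command could be corrected, assuming GROMACS 4.6.3: the bare -s without an argument and the -nt option (which belongs to the thread-MPI build, not mdrun_mpi) are likely sources of the error. File names below are placeholders.

    # real-MPI build: the rank count comes from mpirun, not from -nt
    mpirun -np 16 mdrun_mpi -v -s topol.tpr -deffnm md
    # thread-MPI build on a single node: no mpirun needed
    mdrun -v -s topol.tpr -ntmpi 2 -deffnm md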

[gmx-users] g_sham

2013-10-14 Thread pratibha kapoor
Dear all GROMACS users, I am creating a free energy landscape using g_sham but my axes are not getting labelled. I have searched the archive and found that using the -xmin and -xmax options we can label them. I first created my 2D projection xvg file using g_anaeig -f *.xtc -s *.tpr -first 1 -last 2
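A minimal sketch of the workflow described above, assuming GROMACS 4.6 tools; file names and the -xmin/-xmax ranges are illustrative placeholders, not values from the original post.

    # project the trajectory onto the first two eigenvectors
    g_anaeig -f traj.xtc -s topol.tpr -v eigenvec.trr -first 1 -last 2 -2d proj-2d.xvg
    # build the landscape; g_sham expects three values (x, y, z) for -xmin and -xmax
    g_sham -f proj-2d.xvg -ls gibbs.xpm -notime -xmin -2 -2 0 -xmax 2 2 5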

[gmx-users] parallel simulation

2013-10-07 Thread pratibha kapoor
I would like to run one simulation in parallel so that it utilises all the available nodes and cores. For that, I have compiled GROMACS with MPI enabled and also installed OpenMPI on my machine. I am using the following command: mpirun -np 4 mdrun_mpi -v -s *.tpr When I use the top command, I get:

[gmx-users] Re: parallel simulation

2013-10-07 Thread pratibha kapoor
To add: I am running simulations on an institute cluster with 8 nodes (2 cores each). Please suggest how I can run one simulation on all available nodes, cores and threads. Thanks in advance. On Mon, Oct 7, 2013 at 1:55 PM, pratibha kapoor kapoorpratib...@gmail.com wrote: I would
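A minimal sketch of launching one run across the whole cluster, assuming an OpenMPI installation and a hostfile listing the 8 nodes (the hostfile and tpr names are placeholders):

    # 8 nodes x 2 cores = 16 MPI ranks for a single simulation
    mpirun -np 16 -hostfile hosts mdrun_mpi -v -s topol.tpr -deffnm run1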

[gmx-users] principal component analysis

2013-09-28 Thread pratibha kapoor
Dear all users, I would like to calculate PC loadings for various integrated factors in the form of the following sample table:

Integrated Factors              PC1   PC2    PC3    PC4    PC5   PC6   PC7    PC8   PC9    PC10
Total non polar surface area    0.60  -0.07  -0.76  -0.11  0.08  0.05  -0.16  0.08  -0.01  0.02
Native

[gmx-users] principal component analysis

2013-09-27 Thread pratibha kapoor
Dear all gmx users, I would like to calculate PC loadings for various integrated factors in the form of the following sample table:

Integrated factors             PC1   PC2    PC3    PC4    PC5    PC6   PC7   PC8    PC9   PC10
Total nonpolar surface area    0.60  -0.07  -0.76  -0.11  -0.06  0.11  0.05  -0.16  0.06  -0.02
Chain exposed area             0.92  -0.14  -0.05  0.12