>
> Here's your problem. You have pairs defined that are in excess of 12 nm,
> but they are assigned to a 1-4 interaction, so atoms that should be
> separated by three bonds. The user-defined potential shouldn't matter
> here unless you've added [pairs] to the topology.
>
> I see your point.
Lab: 303 Engel Hall
Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061
jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com
==============
--
Message: 3
Date: Sun, 1 Sep 2019 13:22:15 -0400
ch pull coordinate
> >> With 1 pull groups, expect 2 columns (including the time column)
> >> Reading file umbrella71.tpr, VERSION 5.1.4 (single precision)
> >> Reading file umbrella98.tpr, VERSION 5.1.4 (single precision)
> >> Reading file umbrella111.t
On 9/1/19 5:44 AM, Avijeet Kulshrestha wrote:
Hi all,
I am running a Martini coarse-grained simulation with a 15 fs time step in
GROMACS 2018.6. I have 25859 atoms and my box size is:
12.0 14.0 18.0
The system contains protein, a membrane (DPPC), and ions.
I have minimized the energy with 16 processors and the -rdd option set to 2.5. It worked
Hi Mark,
To my knowledge, she's not using CHARMM-related FFs at all -- I think she
is using Amber03 (Alyssa, correct me if I'm wrong). Visually and RMSD-wise
the trajectory looks totally normal, but is there something specific I
should be looking for in the trajectory, either visually or quantitatively?
Hi,
What does the trajectory look like before it crashes?
We did recently fix a bug relevant to simulations using CHARMM switching
functions on GPUs, if that could be an explanation. We will probably put
out a new 2018 version with that fix next week (or so).
Mark
On Thu., 14 Feb. 2019, 20:26 M
Hi all,
My student is trying to do a fairly straightforward MD simulation -- a
protein complex in water with ions with *no* pull coordinate. It's on an
NVidia GPU-based machine and we're running gromacs 2018.3.
About 65 ns into the simulation, it dies with:
"an atom moved too far between two domain decomposition steps"
Hi,
The implicit solvent support got a bit broken between 4.5 and 4.6, and
nobody has yet worked out how to fix it, sorry. If you can run with 1 cpu,
do that. Otherwise, please use GROMACS 4.5.7.
Mark
On Mon, Jun 18, 2018 at 9:21 AM Chhaya Singh
wrote:
I am running a simulation of a protein in implicit solvent using the Amber
ff99sb force field with GBSA.
I am not able to use more than one CPU.
It always gives a domain decomposition error if I use more than one CPU.
When I tried running on one CPU, it gave me this error:
"Fatal error
Check the trajectories before and after conversion and make sure there
are no PBC artifacts; if there are, fix them.
Or do the analysis with the available trajectories (perhaps in VMD with Tcl
scripts).
--
Regards,
Nikhil Maroli
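The PBC check suggested above can be sketched numerically. Below is a minimal illustration, assuming a rectangular box, of wrapping a coordinate back into the primary cell -- the per-atom operation behind a "put atoms back in the box" fix. The box lengths and coordinate are made-up values in nm, not from any message in this thread:

```python
# Minimal sketch: wrap a coordinate into a rectangular periodic box.
# Illustrative values only; real trajectory fixes should be done with
# the proper tools (e.g. trjconv), which also handle molecules whole.

def wrap(coord, box):
    """Wrap one (x, y, z) coordinate into [0, L) along each box axis."""
    return tuple(c - box[i] * (c // box[i]) for i, c in enumerate(coord))

box = (4.7, 4.7, 9.4)       # box edge lengths, nm (made-up)
atom = (5.2, -0.3, 9.5)     # an atom that drifted outside the box

print(wrap(atom, box))      # each component now lies inside the box
```

Floor division handles negative coordinates correctly, so an atom just below zero is mapped to just below the box length rather than reflected.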
--
Gromacs Users mailing list
* Please search the archive at
http://www.groma
Thanks for the reply Mark.
On Sat, Apr 28, 2018 at 4:32 PM, Mark Abraham
wrote:
Hi,
Clearly the conversion tool did not produce a file that conforms to the
requirements GROMACS has for specifying periodic boxes. That may not work
well even if you'd run mdrun without domain decomposition because the
periodicity may not be understood correctly. Find out what was going on and
ho
Yes.. I used VMD for conversion...
On Sat, Apr 28, 2018 at 12:50 PM, RAHUL SURESH
wrote:
Hi
Sounds strange, to my limited knowledge. My guess is that it may be due to
the conversion from the NAMD format [dcd] to the GROMACS format [trr],
though I'm not sure.
So you have converted the file format using VMD?
On Sat, Apr 28, 2018 at 12:26 PM, Sahithya S Iyer wrote:
Hi,
Thanks for the reply. I am only doing a rerun of a trajectory that has
already evolved without any dynamic load balancing problems.
-rerun only recalculates energies, right? I don't understand why the same
trajectory is giving a decomposition error now.
On Sat, Apr 28, 2018 at 12:11 PM, RAHUL SU
Hi.
That indicates a problem with dynamic load balancing. Try building
boxes of different sizes.
On Sat, Apr 28, 2018 at 11:57 AM, Sahithya S Iyer wrote:
Hi,
I am trying to calculate interaction between specific residues using gmx
mdrun -rerun flag. The trajectory was in a dcd format, which I converted to
a trr file. I get the following error -
Domain decomposition has not been implemented for box vectors that have
non-zero components in direction
Well, I do not do anything special when preparing this system compared to
other systems that do not show this issue.
I have carefully inspected my system and I know what is wrong. I did some
manipulations to the PDB file due to a missing fragment of a
residue, and accidentally put the NZ atom of a lysine about 3.5
On 4/15/18 9:29 AM, Dawid das wrote:
Dear Gromacs Users,
I run numerous MD simulations for similar systems of protein in a water box,
and for only one system I encounter this error:
Fatal error: There is no domain decomposition for 4 ranks that is
compatible with the given box and a minimum cell size of 3.54253 nm. Change
the number of ranks
On 5/18/17 5:59 AM, Kashif wrote:
I got this error every time I try to simulate one of my protein-ligand
complexes.
---
Program mdrun, VERSION 4.6.6
Source code file: /root/Documents/gromacs-4.6.6/src/mdlib/pme.c, line: 851
Fatal error:
1 particles communicated to PME node 5
Dear all gromacs users,
I have seen in the mailing list archive that this domain decomposition error
can be avoided with a smaller number of processors, but how do I find the
suitable number of processors?
Here is the log file:
https://drive.google.com/file/d/0Bzs8lO6WJxD9alRTYjFaMjBTT2c/view?usp=sharing
--
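The rank-count question above can be explored with a rough sketch: a candidate 3-D rank grid is only feasible if every cell it produces is at least the minimum cell size wide in each dimension. This is purely an illustration that mixes numbers from two different messages in this digest (the 12x14x18 nm box and the 3.54253 nm minimum); real GROMACS additionally sets aside PME-only ranks and applies load-balancing margins:

```python
# Rough sketch: which MPI rank counts admit a 3-D domain decomposition
# grid whose cells are all >= min_cell nm wide? Illustrative only; the
# real mdrun heuristics also reserve PME-only ranks, etc.
from itertools import product

def max_grid(box, min_cell):
    """Most cells per dimension such that box[i] / n >= min_cell."""
    return tuple(int(b // min_cell) for b in box)

def compatible_rank_counts(box, min_cell):
    """All feasible products gx * gy * gz of per-dimension cell counts."""
    nx, ny, nz = max_grid(box, min_cell)
    return sorted({gx * gy * gz
                   for gx, gy, gz in product(range(1, nx + 1),
                                             range(1, ny + 1),
                                             range(1, nz + 1))})

box = (12.0, 14.0, 18.0)           # nm, box from one message above
print(max_grid(box, 3.54253))      # -> (3, 3, 5)
print(compatible_rank_counts(box, 3.54253))
```

The idea matches the error text: when the requested number of ranks cannot be factored into such a grid, mdrun reports that no compatible domain decomposition exists, and a smaller rank count (or a larger box, or smaller cutoffs) is needed.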
"Mark Abraham"
To: gmx-us...@gromacs.org
Sent: Tuesday, March 7, 2017 4:25:12 AM
Subject: Re: [gmx-users] domain decomposition Error
Hi,
Exactly. NVT not exploding doesn't mean it's ready for NpT, particularly if
the volume is just wrong, or you try to use Parrinello-Rahman too soon.
Mark
llibration step only and not during
> the NVT equilibration step. I have successfully done 1 ns of NVT
> equilibration.
>
>
> ----- Original Message -----
> From: "Mark Abraham"
> To: gmx-us...@gromacs.org
> Sent: Tuesday, March 7, 2017 2:32:46 AM
> Subje
Hi,
There's good advice for this problem at the link suggested in
the error message: http://www.gromacs.org/Documentation/Errors. Probably
your box volume or NpT protocol needs some attention.
Mark
On Tue, 7 Mar 2017 06:23 shweta singh wrote:
> Thank you !
>
> On Tue, Mar 7, 2017 at
Thank you !
On Tue, Mar 7, 2017 at 9:47 AM, MRINAL ARANDHARA <arandharamri...@iitkgp.ac.in> wrote:
I am trying to run a lipid bilayer simulation, but during the NPT equilibration
step I am getting the following error:
"1 particles communicated to PME rank 6 are more than 2/3 times the cut-off out
of the domain decomposition cell of their charge group in dimension y"
I have successfully run the
Hi Qasim,
> On 12 Jan 2017, at 20:22, qasimp...@gmail.com wrote:
Hi Carsten,
I think I couldn't clearly explain the protocol that I follow; sorry for that.
First, I do the EM, NVT (100 ps), NPT (100 ps), and MD (100 ns) steps for
equilibration. In all those steps I use the free energy parameters below for
the forward state:
free-energy = yes
init-lambda
Hi Qasim,
> On 11 Jan 2017, at 20:29, Qasim Pars wrote:
Dear Carsten,
Thanks. The forward state simulations work properly with mdrun -ntmpi 8
-ntomp 2 or mdrun -ntmpi 4 -ntomp 4, as you suggested.
For the backward state, GROMACS still gives a "too many LINCS warnings" error
with those mdrun commands in the MD step, indicating the system is far from
equilibrium
Dear Qasim,
those kinds of domain decomposition 'errors' can happen when you
try to distribute an MD system among too many MPI ranks. There is
a minimum cell length for each domain decomposition cell in each
dimension, which depends on the chosen cutoff radii and possibly
other inter-atomic constraints
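Carsten's workaround quoted above (e.g. mdrun -ntmpi 8 -ntomp 2) follows directly from this: keep the total thread count fixed, but shift threads from MPI ranks (which multiply DD cells) to OpenMP threads (which don't). A tiny sketch of the possible splits for a fixed core count -- purely illustrative, since mdrun chooses a split automatically when you don't specify one:

```python
# Sketch: enumerate (ntmpi, ntomp) factorizations of a fixed thread
# total. Fewer MPI ranks mean fewer, larger DD cells, which can avoid
# minimum-cell-size errors at some cost in OpenMP scaling.

def thread_splits(total):
    """All (ntmpi, ntomp) pairs with ntmpi * ntomp == total."""
    return [(m, total // m) for m in range(1, total + 1) if total % m == 0]

print(thread_splits(16))   # -> [(1, 16), (2, 8), (4, 4), (8, 2), (16, 1)]
```

Moving left along this list trades DD cell count for OpenMP width; -ntmpi 8 -ntomp 2 and -ntmpi 4 -ntomp 4 from the message above are two such points for 16 threads.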
Dear users,
I am trying to simulate a protein-ligand system including ~2 atoms with
waters using GROMACS-2016.1. The protocol I tried is the forward state of the
free energy calculation. The best ligand pose used in the simulations was
obtained with AutoDock. At the beginning of the simulation GROMACS s
On Fri, Mar 18, 2016 at 7:47 AM, <
gromacs.org_gmx-users-requ...@maillist.sys.kth.se> wrote:
>
> Message: 4
> Date: Fri, 18 Mar 2016 07:46:48 -0400
> From: Justin Lemkul
> To: gmx-us...@gromacs.org
> Subject: Re: [gmx-users] Domain decomposition error tied to free
>
Hello,
I have been attempting to carry out some free energy calculations, but to
verify the sanity of my parameters, I decided to test them on a structure I
knew to be stable -- the lysozyme from Lemkul's lysozyme in water tutorial.
I chose the L75A mutation because it is out on the surface to mi
On 3/17/16 8:21 PM, Ryan Muraglia wrote:
I think I've just found my mistake. Thank you so much again.
Khatnaa
On Friday, 30 October 2015, 18:55, Justin Lemkul wrote:
On 10/30/15 7:09 AM, badamkhatan togoldor wrote:
Thank you Justin.
The better question is why you're trying to decouple an entire protein; that is
extremely impractical and unlikely to be useful.
Did I do that? Then it's my mistake, from lack of knowledge. How do I fix that?
Khatnaa
On 10/29/15 4:56 AM, badamkhatan togoldor wrote:
Dear GMX Users,
I am simulating the free energy of protein chain_A in water in parallel, and I
got a domain decomposition error in mdrun:
Will use 15 particle-particle and 9 PME only ranks
This is a guess, check the performance at the end of the log file
---
Hi SMA,
It says you have bonds over large distances. Check the
structure/topology/setup.
Cheers,
Tsjerk
On Oct 27, 2015 08:02, "Musharaf Ali" wrote:
Dear users
During energy minimization for an IL-water system in a box of size 4.7x4.7x9.4
with 432 BMIMTF2N and 3519 water molecules, the following error is written
in the md.log file.
Initializing Domain Decomposition on 144 nodes
Dynamic load balancing: no
Will sort the charge groups at every domain
Thank you Mark for the reply.
We are not sure about it either, as it worked when we started the simulation
again using the cpt file, and there
was also no issue when we did the same simulation using (LINCS algorithm)
constraints.
Thanks,
Siva
On Jul 23, 2014, at 4:20 PM, Mark Abraham wrote:
>
Dear All,
I am running simulations of BMP2 protein and graphite sheet using implicit
solvent model (mdp file is pasted below). The graphite atoms are frozen in the
simulation and BMP2 is free to translate.
I got an error: "Step 1786210: The domain decomposition grid has shifted too
much in the Z-direction"
On Mon, Jul 21, 2014 at 3:48 PM, Siva Dasetty wrote: