Hello,

we resolved some issues with COM removal in the latest release (please see the patch notes for more details).
So it might be that you were affected by this before.

Please let us know if the issue shows up again.

Cheers

Paul

On 06/03/2020 18:58, Daniel Kozuch wrote:
Additional (good) news: the problem appears to be resolved in the 2020.1 update 
(at least for the membrane-only system). I'll conduct some additional tests to 
see if there are any lingering problems.

Dan

-----Original Message-----
From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se 
<gromacs.org_gmx-users-boun...@maillist.sys.kth.se> On Behalf Of Justin Lemkul
Sent: Friday, March 6, 2020 11:02 AM
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users] GMX 2020 - COMM Removal Issue



On 3/6/20 10:00 AM, Daniel Kozuch wrote:
[Somehow my response got put in a different thread - hopefully this
works]

Justin,

Thanks for your reply. I agree that some COM motion is normal.
However, this was a very short simulation (1 ns), so the size of the
drift (several nm) was unexpected. To test, I repeated the simulation
with GROMACS 2019.6 using all the same settings (but without the new
environment variables GMX_GPU_DD_COMMS, GMX_GPU_PME_PP_COMMS, and
GMX_FORCE_UPDATE_DEFAULT_GPU), and I don't see the same drift.

Does a membrane system (no protein) also cause the same issue?

-Justin

Best,
Dan

-----Original Message-----
From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se
<gromacs.org_gmx-users-boun...@maillist.sys.kth.se> On Behalf Of
Justin Lemkul
Sent: Tuesday, March 3, 2020 3:02 PM
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users] GMX 2020 - COMM Removal Issue



On 3/2/20 9:53 PM, Daniel Kozuch wrote:
Hello,

I am experimenting with GROMACS 2020. I have compiled the thread-MPI
version and am using the new environment variables (GMX_GPU_DD_COMMS,
GMX_GPU_PME_PP_COMMS, GMX_FORCE_UPDATE_DEFAULT_GPU) as suggested at the
following link:
https://devblogs.nvidia.com/creating-faster-molecular-dynamics-simulations-with-gromacs-2020/.
I am running mdrun with the suggested options
"-pin on -nb gpu -bonded gpu -pme gpu -npme 1 -nstlist 400" on 4 GPUs
and 28 CPUs with "-ntmpi 4 -ntomp 7".
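
Put together, the run looks roughly like this (the -deffnm name is just a
placeholder for my actual run files):

export GMX_GPU_DD_COMMS=true
export GMX_GPU_PME_PP_COMMS=true
export GMX_FORCE_UPDATE_DEFAULT_GPU=true

# 4 thread-MPI ranks x 7 OpenMP threads = 28 cores, 4 GPUs, 1 PME rank
gmx mdrun -deffnm md -pin on -nb gpu -bonded gpu -pme gpu -npme 1 \
          -nstlist 400 -ntmpi 4 -ntomp 7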

I am currently running a membrane system with a transmembrane protein
in water solvent. I am using the following settings for COM removal:

comm_mode  = linear
comm_grps  = PROT_MEMB SOL_ION

Here I chose to couple the protein and the membrane based on the advice
in this thread:
https://mailman-1.sys.kth.se/pipermail/gromacs.org_gmx-users/2016-September/108584.html
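
(For reference, merged groups like these can be built with gmx make_ndx,
merging Protein with the lipid group into PROT_MEMB and SOL with the ions
into SOL_ION. A rough sketch is below; the file names are placeholders and
the group numbers are system-dependent. The resulting index file is then
passed to grompp with -n.)

gmx make_ndx -f system.gro -o index.ndx
  1 | 13
  name 19 PROT_MEMB
  14 | 15
  name 20 SOL_ION
  q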

Unfortunately, I still see a large drift in the z-dimension for the
entire membrane/protein group. Currently I have nstcalcenergy/nstcomm
set to 100, as decreasing them leads to poor performance (hopefully it
is unnecessary to set them to 1).

Removing artificial contributions to COM motion does not mean that the entities 
cannot diffuse over time. Depending on the length of your simulation, drift in 
absolute position can be perfectly normal.

-Justin

Does anyone have suggestions for avoiding the COM drift? I know this
issue has been discussed before
(https://redmine.gromacs.org/issues/2867) but it looks like it was
resolved in earlier GROMACS versions. As a note, I am using a CHARMM force 
field with CMAP dihedrals.
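
(As a side note, one way to quantify the drift is to extract the group's
center of mass over time with gmx traj and look at its z column; the file
names here are placeholders:)

gmx traj -s md.tpr -f md.xtc -n index.ndx -com -ox com_prot_memb.xvg
(selecting the PROT_MEMB group at the prompt)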


Here are some other potentially relevant mdp options (from CHARMM) in
case they help:

integrator              = md
dt                      = 0.002
nstcalcenergy           = 100
;
cutoff-scheme           = Verlet
nstlist                 = 20
rlist                   = 1.2
coulombtype             = pme
rcoulomb                = 1.2
vdwtype                 = Cut-off
vdw-modifier            = Force-switch
rvdw_switch             = 1.0
rvdw                    = 1.2
;
tcoupl                  = v-rescale
tc_grps                 = PROT_MEMB   SOL_ION
tau_t                   = 1.0    1.0
ref_t                   = 303.15 303.15
;
pcoupl                  = Berendsen
pcoupltype              = semiisotropic
tau_p                   = 5.0
compressibility         = 4.5e-5  4.5e-5
ref_p                   = 1.0     1.0
;
constraints             = h-bonds
constraint_algorithm    = LINCS
continuation            = yes

Best,
Dan
--
==================================================

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==================================================



--
Paul Bauer, PhD
GROMACS Development Manager
KTH Stockholm, SciLifeLab
0046737308594
