Re: [gmx-users] Coarse-grained Protein-ligand simulations

2019-04-01 Thread P C Kroon
Hi,

I work in privileged Europe, so it’s good for me to get a reality check every 
once in a while. Thanks.

Coarse-graining molecules for Martini is not too hard. There are tutorials on 
cgmartini.nl that should help you get underway. You will, however, run into the 
problems I mentioned, and you will need to do extensive validation of the 
topologies of your ligands. Again, it depends on your exact research question: 
if you’re doing high-throughput-style screening, qualitative models may be good 
enough. Also see T. Bereau’s auto_martini.
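To give a feel for what a CG mapping involves, here is a minimal Python sketch that places each bead at the centre of mass of a group of atoms. The atom coordinates, masses, and the two-bead mapping below are invented for illustration; a real mapping must follow the Martini building rules and be validated against atomistic reference data.

```python
# Minimal sketch: map atomistic coordinates to CG beads by centre of mass.
# The atoms and the B1/B2 mapping are illustrative placeholders, not a
# validated Martini mapping.

def com(coords, masses):
    """Centre of mass of a list of (x, y, z) tuples (nm) with matching masses."""
    total = sum(masses)
    return tuple(
        sum(m * c[i] for m, c in zip(masses, coords)) / total
        for i in range(3)
    )

def map_to_cg(atoms, mapping):
    """atoms: name -> ((x, y, z), mass); mapping: bead name -> list of atom names."""
    beads = {}
    for bead, names in mapping.items():
        coords = [atoms[n][0] for n in names]
        masses = [atoms[n][1] for n in names]
        beads[bead] = com(coords, masses)
    return beads

if __name__ == "__main__":
    atoms = {
        "C1": ((0.0, 0.0, 0.0), 12.0),
        "C2": ((0.1, 0.0, 0.0), 12.0),
        "C3": ((0.2, 0.1, 0.0), 12.0),
        "C4": ((0.3, 0.1, 0.0), 12.0),
    }
    mapping = {"B1": ["C1", "C2"], "B2": ["C3", "C4"]}
    print(map_to_cg(atoms, mapping))
```

The geometry is the easy part; the bead type assignments (and hence the non-bonded interactions) are where the validation effort goes.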

Peter

From: Mac Kevin Braza
Sent: 01 April 2019 16:06
To: gmx-us...@gromacs.org
Cc: gromacs.org_gmx-users@maillist.sys.kth.se
Subject: Re: [gmx-users] Coarse-grained Protein-ligand simulations

Dear Sir Peter Kroon,

We are currently maximizing our computing capacity to reach the microsecond
scale, but with our lab's resources it would take at least 6 months to finish
one microsecond. We do not have that level of computing capacity here in the
Philippines. Membrane protein systems are typically larger, with the lipid
bilayer, solvent, and ions present on top of the protein, so we will need more
powerful computers for this part.

I found a few works in the literature on protein-ligand representation in
coarse-grained models. The papers we found either have a vague methodology for
the ligand coarse-graining and/or do not address the same research problem we
want to explore.

All in all, we will continue the simulation in all-atom for as long as we can,
while remaining hopeful about the coarse-graining method. What we have explored
so far is the CHARMM-GUI Martini Maker, but it does not include drug ligands in
the coarse-grained representation. I still have to search for other means to do
this. Thank you very much!

Best regards,
Mac Kevin E. Braza

On Mon, Apr 1, 2019 at 5:59 PM Peter Kroon  wrote:

> Hi,
>
> that's probably a tough cookie. My first instinct would be to just apply
> more hardware, and do it all atomistically. A microsecond should be
> within reach. Whether it's enough is a separate matter. The problem is
> that most CG representations don't get the shape of both your pocket and
> ligand exactly right, producing unreliable answers. In addition, in most
> CG FFs hydrogen bonds are isotropic and not specific enough for this
> kind of problem.
>
> If "more hardware" is not an option you'll need to dive into the literature
> to see if people have done CG protein-ligand binding/docking/unbinding
> (depending on the research question). I would also be very skeptical of any
> (absolute) kinetics produced by CG simulations.
>
> As a last-ditch effort you could look into multiscaling, but that's a
> research topic in its own right.
>
>
> Peter
>
>
> On 01-04-19 11:49, Mac Kevin Braza wrote:
> > Thank you Prof. Lemkul,
> >
> > I appreciate your comment on this part.
> >
> > Sir Peter Kroon,
> >
> > We want to do coarse-grained MD simulation to access long-timescale
> > events of the effect of ligand binding to the GPCR, at least a
> > microsecond. For now, the most accessible means for us is to do CGMD.
> > But we are currently stuck choosing which set-up will suit us best,
> > and whether it will allow us to see these events. We are also looking
> > into the possibility of coarse-graining the ligand, and if you can
> > share your expertise in coarse-graining the ligand, that would be
> > great.
> > I appreciate this Sir Kroon, thank you very much!
> >
> > Best regards,
> > Mac Kevin E. Braza
> >
> > On Mon, Apr 1, 2019 at 5:07 PM Peter Kroon  wrote:
> >
> >> If I may chip in: It really depends on what you're studying, and what
> >> forcefield you're using to do it. Unfortunately there is no FF that
> >> reproduces all behaviour accurately. The art is in picking one that (at
> >> least) reproduces what you're interested in.
> >>
> >>
> >> Peter
> >>
> >> On 29-03-19 17:26, Justin Lemkul wrote:
> >>>
> >>> On 3/29/19 9:17 AM, Mac Kevin Braza wrote:
>  Thank you Professor Lemkul,
> 
>  But could you suggest how I can coarse-grain the ligand I am using? I
>  have been searching for resources online, but they do not work for us.
> >>> I don't work with CG simulations, so I'm not much help. I would think
> >>> that a CG parametrization of a ligand would remove all the detail
> >>> you'd normally want to see in terms of ligand-protein interactions.
> >>>
> >>> -Justin
> >>>
>  I hope you can help us. Thank you Prof. Lemkul!
> 
>  Best regards,
>  Mac Kevin E. Braza
> 
>  On Fri, Mar 29, 2019, 8:59 PM Justin Lemkul  wrote:
> 
> > On 3/29/19 3:32 AM, Mac Kevin Braza wrote:
> >> Hello everyone,
> >>
> >> I am simulating a coarse-grained model of a membrane protein (GPCR)
> in
> >> lipid bilayer and an all-atom ligand octopamine. I build the
> protein,
> >> solutes, and membrane in the web server CHARMM-GUI. While, I added
> the
> 

Re: [gmx-users] How to set the inter-chain disulfide bond in martini?

2019-01-29 Thread P C Kroon
Hi Cheng,

What’s the distance between the two interchain SG atoms? You can either tune 
the distance used for auto-detecting Cys bonds, or specify all of them 
explicitly on the command line. Try `python martinize.py -h`.
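To see whether the interchain bridge falls just outside the 0.22 nm auto-detection cutoff, you can compute the SG–SG distances yourself. A minimal Python sketch, with invented placeholder coordinates (real ones would come from your PDB file):

```python
# Check which SG-SG pairs fall within a cutoff, mimicking the cystine-bridge
# auto-detection reported by martinize.py. Coordinates (nm) are illustrative
# placeholders, not taken from the poster's structure.
from itertools import combinations
from math import dist  # Python >= 3.8

def detect_bridges(sg_atoms, cutoff=0.22):
    """sg_atoms: {(chain, resid): (x, y, z)} in nm. Returns pairs under cutoff."""
    bridges = []
    for (a, pa), (b, pb) in combinations(sg_atoms.items(), 2):
        d = dist(pa, pb)
        if d < cutoff:
            bridges.append((a, b, round(d, 4)))
    return bridges

if __name__ == "__main__":
    sg = {
        ("L", 214): (1.00, 1.00, 1.00),
        ("H", 220): (1.00, 1.00, 1.25),   # 0.25 nm apart: misses the 0.22 cutoff
        ("L", 23):  (2.00, 2.00, 2.00),
        ("L", 88):  (2.00, 2.00, 2.204),  # 0.204 nm apart: detected
    }
    print(detect_bridges(sg))              # only the L23-L88 pair
    print(detect_bridges(sg, cutoff=0.3))  # now also L214-H220
```

If the interchain pair turns out to sit just above the cutoff, raising the detection distance (or specifying the bridge explicitly, if your martinize.py version supports that) should make the constraint appear in the merged itp.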

Peter
PS. The cgmartini.nl forum is also back online, which might be a better medium 
for these martini specific questions.

From: ZHANG Cheng
Sent: 29 January 2019 16:08
To: gromacs.org_gmx-users
Subject: [gmx-users] How to set the inter-chain disulfide bond in martini?

Dear Martini friends,


My protein has 10 Cys forming 5 disulfide bonds in the light chain (LC) and 
heavy chain (HC):
) Interchain disulfide bond: LC214 – HC220
) Intrachain disulfides of LC: LC23 – LC88, LC134 – LC194
) Intrachain disulfides of HC: HC144 – HC200, HC96 – HC22


Using this command generates only the four intrachain disulfide bonds, not the 
interchain one:
$ python martinize.py -f protein.pdb -o system.top -x protein-CG.pdb -dssp 
./dssp-2.0.4-linux-amd64 -p backbone -ff martini22 -cys auto -merge L,H


prompt:
$ INFO   Checking for cystine bridges, based on sulphur (SG) atoms lying 
closer than 0.2200 nm
$ INFO   Detected SS bridge between ('SG', 'CYS', 23, 'L') and ('SG', 
'CYS', 88, 'L') (0.204094 nm)
$ INFO   Detected SS bridge between ('SG', 'CYS', 134, 'L') and ('SG', 
'CYS', 194, 'L') (0.204489 nm)
$ INFO   Detected SS bridge between ('SG', 'CYS', 214, 'L') and ('SG', 
'CYS', 434, 'H') (0.203603 nm)
$ INFO   Detected SS bridge between ('SG', 'CYS', 358, 'H') and ('SG', 
'CYS', 414, 'H') (0.204830 nm)


I can see the four disulfide bonds reflected in the [ constraints ] section of 
the "Protein_L+Protein_H.itp" file:
$   45   191  1   0.24000 ; Cys-bonds/special link
$  294   431  1   0.24000 ; Cys-bonds/special link
$  475   947  1   0.24000 ; Cys-bonds/special link
$  778   900  1   0.24000 ; Cys-bonds/special link


So how can I also ensure that the interchain disulfide bond stays intact?


You can find my files at
https://github.com/lanselibai/martini/tree/master/20190129_disulfide


Thank you!


Yours sincerely
Cheng
-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.


Re: [gmx-users] "Too many LINCS warnings" in a minimization after solvation with coarse-grained waters

2019-01-22 Thread P C Kroon
That depends on your itp file. If you look in martini_v2.2P.itp at the 
definition of WP (polarizable water), for example, you can see its constraints 
are wrapped like this:

#ifdef FLEXIBLE
; for minimization purposes constraints might be replaced by stiff bonds
[ bonds ]
1 2 1 0.14 1
1 3 1 0.14 1
#else
[ constraints ]
1 2 1 0.14
1 3 1 0.14
#endif

That means that in this case you can just add `define = -DFLEXIBLE` to your mdp 
file. Otherwise you’ll need to go through your itp by hand.
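For example, a minimization mdp could carry the define like this (only the `define` line matters here; the other settings are illustrative):

```
; minimization.mdp (fragment)
define      = -DFLEXIBLE   ; replace constraints with stiff bonds during EM
integrator  = steep
emtol       = 100
nsteps      = 5000
```

After minimization, run the production grompp without the define so the constraints are used again.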

Peter

From: ZHANG Cheng
Sent: 22 January 2019 15:23
To: gromacs.org_gmx-users
Subject: Re: [gmx-users] "Too many LINCS warnings" in a minimization 
aftersolvation with coarse-grained waters

Dear Fotis and Peter,


Thank you very much for the help.


Fotis, can I modify the mdp file to use a "soft" potential modifier? How do I 
do that?


I think my problem is not the first cause (i.e. something wrong with the 
system structure or topology), because the potential decreases during the 
minimization after inserting 10 protein molecules:
https://github.com/lanselibai/martini/blob/master/20190121_LINCS/minimization%20after%20insert%2010%20proteins.jpg
$ Steepest Descents converged to machine precision in 4251 steps,
$ but did not reach the requested Fmax < 1.
$ Potential Energy  = -2.4130119e+05
$ Maximum force =  9.1535597e+00 on atom 2335
$ Norm of force =  7.1063030e-01





Peter, how do I replace all constraints with stiff bonds?





-- Original --
From:  "ZHANG Cheng"<272699...@qq.com>;
Date:  Tue, Jan 22, 2019 05:58 AM
To:  "gromacs.org_gmx-users";

Subject:  Re: "Too many LINCS warnings" in a minimization after solvation with 
coarse-grained waters



I also tried reducing "emtol" gradually in the mdp file, i.e. from 1000 to 100 to 
10. It passed at "emtol = 1000" but stopped again at "emtol = 100", outputting 
dozens of pdb files before the "LINCS warnings".


Then I looked at the edr files.
The potential in the "emtol = 1000" and "emtol = 100" runs was actually converging:
https://github.com/lanselibai/martini/blob/master/20190121_LINCS/emtol%201000%20then%20100.png


My understanding of the "LINCS warnings" is that the system is not stable. But 
why is the potential still converging?


Do I need to adjust the "lincs warning threshold", or "set the environment 
variable GMX_MAXCONSTRWARN to -1"? How to do that?


Is there a "standard" mdp file for minimization for a coarse-grained system 
with 10 proteins in water?
I am using this, but I do not know how to modify it.
https://github.com/lanselibai/martini/blob/master/20190121_LINCS/minimization_solvate.mdp




-- Original --
From:  "ZHANG Cheng"<272699...@qq.com>;
Date:  Tue, Jan 22, 2019 00:12 AM
To:  "gromacs.org_gmx-users";

Subject:  "Too many LINCS warnings" in a minimization after solvation with 
coarse-grained waters



I am doing coarse-grained (CG) modelling of 10 proteins in a box. I got "Too 
many LINCS warnings" in the minimization after solvation with coarse-grained 
waters.


I tried to diagnose the problem based on 
http://manual.gromacs.org/documentation/2018/user-guide/terminology.html#blowing-up


My procedure is


1) A single CG protein was first minimized in vacuum; no problem.


2) Then 10 copies of this protein were inserted into a box, followed by a 
minimization. It "stopped because the algorithm tried to make a new step whose 
size was too small, or there was no change in the energy since last step." So I 
think this minimization was also successful.


3) The system was then solvated by
$ gmx solvate -cp 10_noW_minimized.gro -cs water-box-CG_303K-1bar.gro -radius 
0.21 -o system-solvated.gro -p system.top


4) Then the solvated system is minimized by
$ gmx grompp -f minimization_solvate.mdp -c system-solvated.gro -p system.top 
-o system-min-solvent.tpr
$ gmx mdrun -deffnm system-min-solvent -v -c system-min-solvent.gro

PDB structures were output from step 327 to step 710, and the run stopped due to 
the "LINCS warnings".


The "minimization_solvate.mdp" is here
https://github.com/lanselibai/martini/blob/master/20190121_LINCS/minimization_solvate.mdp


The "system-min-solvent.log" is here
https://github.com/lanselibai/martini/blob/master/20190121_LINCS/system-min-solvent.log


So I think the system with 10 proteins in vacuum is okay (right?). But when 
CG water is added, there is a problem? How should I modify my system? Let me 
know if you need other information. Thank you.


Cheng


Re: [gmx-users] secondary structure analysis

2018-09-24 Thread P C Kroon
Hi Soham,

In plain Martini you *cannot* see secondary structure changes. The secondary 
structure is usually fixed using either elastic bonds or extra dihedrals. If 
you apply neither, the protein will generally collapse into a spherical ball of 
entropy. Please refer to e.g. Monticelli et al. [1].
Also, in general, analysis tools (VMD, DSSP) don’t like CG structures: they 
don’t recognise the names of the particles (and to be honest, I don’t blame 
them).

Peter

[1] L. Monticelli, S. K. Kandasamy, X. Periole, R. G. Larson, D. P. Tieleman, 
S. J. Marrink, J. Chem. Theory Comput., 2008, DOI:10.1021/ct700324x.

From: Soham Sarkar
Sent: 24 September 2018 14:10
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users] secondary structure analysis

You can use the do_dssp utility to generate the secondary structure.

On Mon, 24 Sep 2018, 5:34 pm SHAHEE ISLAM,  wrote:

> hi,
> i am doing Martini-based coarse-grained simulation in GROMACS. After
> getting the coarse-grained structure, I can convert the CG pdb to a gro
> file. Using VMD, the secondary structure can be analysed. Can anyone
> suggest what may be the best way to calculate the secondary structure
> change of the protein?
> thanking you
> shahee


Re: [gmx-users] POPG/POPE membrane

2018-07-18 Thread P C Kroon
Hi Marzieh,

You can construct an initial membrane conformation with a few tools. Just 
google for e.g. INSANE, CHARMM-GUI, or membrane builder (or something along 
those lines).
I know there’s at least a tutorial for a mixed Martini (coarse-grained) 
membrane: 
http://cgmartini.nl/index.php/tutorials-general-introduction-gmx5/bilayers-gmx5#Complex-bilayers
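As a rough sketch, an INSANE command line for a 3:1 POPE/POPG Martini bilayer might look something like this. The box dimensions, solvent choice, and file names are placeholders; check `insane -h` for the options your version actually supports:

```
python insane.py -l POPE:3 -l POPG:1 \
    -x 10 -y 10 -z 9 -sol W \
    -o bilayer.gro -p topol.top
```

The `-l LIPID:ratio` arguments set the relative lipid composition; the generated topology and any counter-ions still need checking by hand.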

Good luck!
Peter

From: marzieh dehghan
Sent: 18 July 2018 09:37
To: gromacs.org_gmx-users@maillist.sys.kth.se
Subject: [gmx-users] POPG/POPE membrane

Hi
Dear all,
I want to simulate a mixed POPG/POPE membrane with a 3:1 ratio using GROMACS. I 
would like to know whether there is any tutorial I can use for making the 
mentioned mixed membrane.

I am looking forward to getting your answer,
Thanks a lot,
Marzieh
-- 
Marzieh Dehghan
PhD of Biochemistry
Institute of Biochemistry and Biophysics (IBB)
University of Tehran, Tehran, Iran


Re: [gmx-users] openmpi execution using sbatch & mpirun every command executed several times

2017-12-13 Thread P C Kroon
Although “don’t” is indeed the best advice, there’s a workaround:

srun -n 1 -c 1 gmx_mpi grompp ...

That runs a single task (-n 1) with one CPU per task (-c 1). You’ll need to read 
the srun/mpirun documentation to figure out the exact options for your setup.
Peter
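Putting the advice together, a job script along these lines keeps the serial steps outside the MPI launch (file names and the `cd "$4"` are taken from the original script; everything else is a sketch):

```
#!/bin/bash -l
#SBATCH -N 1
#SBATCH -n 24
#SBATCH -t 02:00:00

cd "$4"

# Serial preparation: runs exactly once, NOT under mpirun/srun
cp file1 file2
grompp ...

# Only the MPI-parallel step goes through the launcher
mpirun -np 24 mdrun_mpi ...
```

If the serial steps must live in a separate script, call that script directly (not via mpirun) and launch only mdrun_mpi through the MPI launcher from inside it.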

From: Mark Abraham
Sent: 12 December 2017 21:45
To: Discussion list for GROMACS users
Subject: Re: [gmx-users] openmpi execution using sbatch & mpirun every command 
executed several times

Hi,

On Wed, Dec 13, 2017, 7:34 AM Vedat Durmaz  wrote:

>
> hi everybody,
>
> i'm working on an ubuntu 16.04 system with gromacs-openmpi (5.1.2)
> installed from the ubuntu repos. everything works fine when i submit
> job.slurm using sbatch where job.slurm roughly looks like this
>
> -
> #!/bin/bash -l
>
> #SBATCH -N 1
> #SBATCH -n 24
> #SBATCH -t 02:00:00
>
> cd $4
>
> cp file1 file2
> grompp ...
> mpirun -np 24 mdrun_mpi ...   ### WORKS FINE
> -
>
> and the mdrun_mpi output contains the following statements:
>
> -
> Using 24 MPI processes
> Using 1 OpenMP thread per MPI process
>
>   mdrun_mpi -v -deffnm em
>
> Number of logical cores detected (24) does not match the number reported
> by OpenMP (12).
> Consider setting the launch configuration manually!
>
> Running on 1 node with total 24 cores, 24 logical cores
> -
>
>
> now, i want to put the last 3 commands of job.slurm into an extra script
> (run_gmx.sh)
>

Don't. It doesn't work with any MPI application.

Mark

-
> #!/bin/bash
>
> cp file1 file2
> grompp ...
> mdrun_mpi ...
> -
>
> that i start with mpirun
>
> -
> #!/bin/bash -l
>
> #SBATCH -N 1
> #SBATCH -n 24
> #SBATCH -t 02:00:00
>
> cd $4
> mpirun -np 24 run_gmx.sh
> -
>
> but now i get strange errors because every non-mpi program (cp, grompp)
> is executed 24 times, which causes a disaster in the file system.
>
>
> if i change job.slurm to
>
> -
> #!/bin/bash -l
>
> #SBATCH -N 1
> #SBATCH --ntasks-per-node=1
> #SBATCH -t 02:00:00
>
> cd $4
> mpirun run_gmx.sh
> -
>
> i get the following error in the output:
>
> Number of logical cores detected (24) does not match the number reported
> by OpenMP (1).
> Consider setting the launch configuration manually!
>
> Running on 1 node with total 24 cores, 24 logical cores
> Using 1 MPI process
> Using 24 OpenMP threads
>
> Fatal error:
> Your choice of 1 MPI rank and the use of 24 total threads leads to the use
> of 24 OpenMP threads, whereas we expect the optimum to be with more MPI
> ranks with 1 to 6 OpenMP threads. If you want to run with this many OpenMP
> threads, specify the -ntomp option. But we suggest to increase the number
> of MPI ranks.
>
>
> and if i use -ntomp
> -
> #!/bin/bash
>
> cp file1 file2
> grompp ...
> mdrun_mpi -ntomp 24 ...
> -
>
> things seem to work fine but mdrun_mpi is extremely slow as if it was
> running on one core only. and that's the output:
>
>   mdrun_mpi -ntomp 24 -v -deffnm em
>
> Number of logical cores detected (24) does not match the number reported
> by OpenMP (1).
> Consider setting the launch configuration manually!
>
> Running on 1 node with total 24 cores, 24 logical cores
>
> The number of OpenMP threads was set by environment variable
> OMP_NUM_THREADS to 24 (and the command-line setting agreed with that)
> Using 1 MPI process
> Using 24 OpenMP threads
>
> NOTE: You requested 24 OpenMP threads, whereas we expect the optimum to be
> with more MPI ranks with 1 to 6 OpenMP threads.
>
> Non-default thread affinity set probably by the OpenMP library,
> disabling internal thread affinity
>
>
>
> what am i doing wrong? what are the proper settings for my goal? i need to
> use an extra script executed with mpirun, and i somehow need to reduce 24
> executions of the serial commands to just one!
>
> any useful hint is appreciated.
>
> take care,
> vedat
>
>


Re: [gmx-users] Installing gromacs on the windows 10 linux subsystem

2017-12-11 Thread P C Kroon
I also have good experiences here. It’s even pretty straightforward to compile 
it yourself. The only downside I’ve found so far is that it seems to be 
impossible to get GPU acceleration.

Peter

From: A. Alamir
Sent: 11 December 2017 05:15
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users] Installing gromacs on the windows 10 linux subsystem


Thanks for your feedback.

Abbas

From: Dan Gil
Sent: Saturday, December 9, 2017 5:43 PM
To: gmx-us...@gromacs.org
Cc: 
gromacs.org_gmx-users@maillist.sys.kth.se
Subject: Re: [gmx-users] Installing gromacs on the windows 10 linux subsystem

Yes! I have gromacs installed and running on my machine.

Dan

On Sat, Dec 9, 2017 at 4:02 AM A. Alamir  wrote:

> Dear all,
>
> I would like to know if using gromacs on the Windows 10 ubuntu Linux
> subsystem (WLS) is supported. On WLS the gromacs package can be directly
> installed via the Ubuntu apt-get command. I know there are other
> windows-based alternatives to installing/using gromacs (ex. Cygwin), but on
> WLS it is straightforward, providing access to an almost native-like Linux
> experience.
>
> I have noticed that Amber MD can be installed/used on WLS (see
> http://ambermd.org/amber_install.html). Please let me know if you’ve
> successfully tested this for gromacs.
>
> Cheers,
>
> Abbas
>
>
>
>

