[gmx-users] Fw: Attempting REMD tutorial by Mark Abraham

2019-05-29 Thread Israel Estrada





From: Israel Estrada
Sent: Wednesday, May 29, 2019 3:15 PM
To: gromacs.org_gmx-users@maillist.sys.kth.se
Subject: Attempting REMD tutorial by Mark Abraham


Hi all,

I'm attempting to follow Mark Abraham's introduction to REMD.


(http://www.gromacs.org/Documentation/Tutorials/GROMACS_USA_Workshop_and_Conference_2013/An_introduction_to_replica_exchange_simulations%3A_Mark_Abraham%2C_Session_1B)


I'm currently attempting, and failing, to run the REMD equilibrations in stage 1, 
using the command:


mpirun -np 4 mdrun_mpi -v -multidir equil[0123]


However, I'm receiving the error "mpirun: cannot start mdrun_mpi on n0 (o): No 
such file or directory".


Any guidance would be very much appreciated. I'm certain it's a fairly simple 
fix; I'm just not well versed in this yet.
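For the record, this kind of mpirun message usually means the launched processes cannot find the mdrun_mpi binary, e.g. it is not on the PATH that mpirun propagates to node n0. A hedged way to check, with /full/path/to standing in for wherever your build actually lives:

which mdrun_mpi                        # is the binary on PATH locally?
mpirun -np 4 which mdrun_mpi           # can each launched rank see it?
mpirun -np 4 /full/path/to/mdrun_mpi -v -multidir equil0 equil1 equil2 equil3

Note also that the equil[0123] glob only expands if those directories already exist in the working directory; otherwise the shell passes it through literally.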

Thank you!

-Israel
-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.


Re: [gmx-users] Segmentation fault, core dumped error

2019-05-29 Thread Dallas Warren
https://redmine.gromacs.org/account/register

Catch ya,

Dr. Dallas Warren
Drug Delivery, Disposition and Dynamics
Monash Institute of Pharmaceutical Sciences, Monash University
381 Royal Parade, Parkville VIC 3052
dallas.war...@monash.edu
-
When the only tool you own is a hammer, every problem begins to resemble a nail.




On Thu, 30 May 2019 at 12:32, Neena Susan Eappen wrote:
>
> Hi Paul Bauer,
>
> I do not have an account on redmine.gromacs.org, nor does that page allow me 
> to create one. Please let me know what to do.
>
> Hi Najamuddin,
>
> I tried xtc file instead of trr, still same error shows up.
>
> Thank you,
> Neena
>
> 
> From: Neena Susan Eappen
> Sent: Wednesday, May 29, 2019 5:25 PM
> To: gromacs.org_gmx-users@maillist.sys.kth.se
> Subject: Re: [gmx-users] Segmentation fault, core dumped error
>
>
> Command: gmx hbond -f md_0_1.trr -s md_0_1.tpr
>
> Specify 2 groups to analyze:
> Group 0 (System)       has 130 elements
> Group 1 (Protein)      has 130 elements
> Group 2 (Protein-H)    has  63 elements
> Group 3 (C-alpha)      has  11 elements
> Group 4 (Backbone)     has  34 elements
> Group 5 (MainChain)    has  47 elements
> Group 6 (MainChain+Cb) has  58 elements
> Group 7 (MainChain+H)  has  58 elements
> Group 8 (SideChain)    has  72 elements
> Group 9 (SideChain-H)  has  16 elements
>
> Select a group: 1  Selected 1: 'Protein'
> Select a group: 1  Selected 1: 'Protein'
>
> Output: Calculating hydrogen bonds in Protein (130 atoms)
> Found 13 donors and 25 acceptors
> trr version: GMX_trn_file (single precision)
> Reading frame 0 time 0.000
> Will do grid-seach on 2374x2374x2374 grid, rcut=0.35
> Segmentation fault (core dumped)
>
> 
> From: Neena Susan Eappen
> Sent: Tuesday, May 28, 2019 11:31 PM
> To: gromacs.org_gmx-users@maillist.sys.kth.se
> Subject: Re: [gmx-users] Segmentation fault, core dumped error
>
> Hello gromacs users,
>
> Is there a reason why segmentation fault appeared when gmx hbond command was 
> used?
>
> Thank you,
> Neena
>
> 
> From: Neena Susan Eappen
> Sent: Friday, May 24, 2019 12:40 PM
> To: gromacs.org_gmx-users@maillist.sys.kth.se
> Subject: [gmx-users] Segmentation fault, core dumped error
>
> Hello gromacs users,
>
> I got an error message when I used gmx hbond command: Segmentation fault, 
> core dumped. Shown below is my gmx version details (if that can point to my 
> problem). Any insight would be appreciated.
>
> GROMACS version:    2018.4
> Precision:  single
> Memory model:   64 bit
> MPI library:thread_mpi
> OpenMP support: enabled (GMX_OPENMP_MAX_THREADS = 64)
> GPU support:disabled
> SIMD instructions:  AVX2_256
> FFT library:fftw-3.3.8-sse2-avx-avx2-avx2_128-avx512
> RDTSCP usage:   enabled
> TNG support:enabled
> Hwloc support:  disabled
> Tracing support:disabled
> Built on:   2019-01-12 0:35
> Built by:   name@HP-PC [CMAKE]
> Build OS/arch:  Linux 4.4.0-17134-Microsoft x86_64
> Build CPU vendor:   Intel
> Build CPU brand:Intel(R) Core(TM) i3-5010U CPU @ 2.10GHz
> Build CPU family:   6   Model: 61   Stepping: 4
> Build CPU features: aes apic avx avx2 clfsh cmov cx8 cx16 f16c fma htt 
> intel lahf mmx msr nonstop_tsc pcid pclmuldq pdcm pdpe1gb popcnt pse rdrnd 
> rdtscp sse2 sse3 sse4.1 sse4.2 ssse3 tdt x2apic
> C compiler: /usr/bin/cc GNU 7.3.0
> C compiler flags:   -march=core-avx2 -O3 -DNDEBUG -funroll-all-loops -fexcess-precision=fast
> C++ compiler:   /usr/bin/c++ GNU 7.3.0
> C++ compiler flags: -march=core-avx2 -std=c++11 -O3 -DNDEBUG -funroll-all-loops -fexcess-precision=fast
>
> Thank you,
> Neena


Re: [gmx-users] Segmentation fault, core dumped error

2019-05-29 Thread Neena Susan Eappen
Hi Paul Bauer,

I do not have an account on redmine.gromacs.org, nor does that page allow me to 
create one. Please let me know what to do.

Hi Najamuddin,

I tried xtc file instead of trr, still same error shows up.

Thank you,
Neena


From: Neena Susan Eappen
Sent: Wednesday, May 29, 2019 5:25 PM
To: gromacs.org_gmx-users@maillist.sys.kth.se
Subject: Re: [gmx-users] Segmentation fault, core dumped error


Command: gmx hbond -f md_0_1.trr -s md_0_1.tpr

Specify 2 groups to analyze:
Group 0 (System)       has 130 elements
Group 1 (Protein)      has 130 elements
Group 2 (Protein-H)    has  63 elements
Group 3 (C-alpha)      has  11 elements
Group 4 (Backbone)     has  34 elements
Group 5 (MainChain)    has  47 elements
Group 6 (MainChain+Cb) has  58 elements
Group 7 (MainChain+H)  has  58 elements
Group 8 (SideChain)    has  72 elements
Group 9 (SideChain-H)  has  16 elements

Select a group: 1  Selected 1: 'Protein'
Select a group: 1  Selected 1: 'Protein'

Output: Calculating hydrogen bonds in Protein (130 atoms)
Found 13 donors and 25 acceptors
trr version: GMX_trn_file (single precision)
Reading frame 0 time 0.000
Will do grid-seach on 2374x2374x2374 grid, rcut=0.35
Segmentation fault (core dumped)
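
A note for the archive: the 2374x2374x2374 line is a hint. The neighbour-grid cells are at least rcut = 0.35 nm on a side, so 2374 cells per dimension implies a box of roughly 830 nm per side being read from the trajectory, and allocating that grid is plausibly what kills the tool. A hedged way to inspect the box the tool sees:

gmx dump -s md_0_1.tpr | grep -A 3 "box ("         # box in the run input
gmx traj -f md_0_1.trr -s md_0_1.tpr -ob box.xvg   # box per frame

If the box really is that large, e.g. an in-vacuo setup, re-boxing the system for analysis might avoid the crash.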


From: Neena Susan Eappen
Sent: Tuesday, May 28, 2019 11:31 PM
To: gromacs.org_gmx-users@maillist.sys.kth.se
Subject: Re: [gmx-users] Segmentation fault, core dumped error

Hello gromacs users,

Is there a reason why segmentation fault appeared when gmx hbond command was 
used?

Thank you,
Neena


From: Neena Susan Eappen
Sent: Friday, May 24, 2019 12:40 PM
To: gromacs.org_gmx-users@maillist.sys.kth.se
Subject: [gmx-users] Segmentation fault, core dumped error

Hello gromacs users,

I got an error message when I used gmx hbond command: Segmentation fault, core 
dumped. Shown below is my gmx version details (if that can point to my 
problem). Any insight would be appreciated.

GROMACS version:    2018.4
Precision:  single
Memory model:   64 bit
MPI library:thread_mpi
OpenMP support: enabled (GMX_OPENMP_MAX_THREADS = 64)
GPU support:disabled
SIMD instructions:  AVX2_256
FFT library:fftw-3.3.8-sse2-avx-avx2-avx2_128-avx512
RDTSCP usage:   enabled
TNG support:enabled
Hwloc support:  disabled
Tracing support:disabled
Built on:   2019-01-12 0:35
Built by:   name@HP-PC [CMAKE]
Build OS/arch:  Linux 4.4.0-17134-Microsoft x86_64
Build CPU vendor:   Intel
Build CPU brand:Intel(R) Core(TM) i3-5010U CPU @ 2.10GHz
Build CPU family:   6   Model: 61   Stepping: 4
Build CPU features: aes apic avx avx2 clfsh cmov cx8 cx16 f16c fma htt 
intel lahf mmx msr nonstop_tsc pcid pclmuldq pdcm pdpe1gb popcnt pse rdrnd 
rdtscp sse2 sse3 sse4.1 sse4.2 ssse3 tdt x2apic
C compiler: /usr/bin/cc GNU 7.3.0
C compiler flags:   -march=core-avx2 -O3 -DNDEBUG -funroll-all-loops -fexcess-precision=fast
C++ compiler:   /usr/bin/c++ GNU 7.3.0
C++ compiler flags: -march=core-avx2 -std=c++11 -O3 -DNDEBUG -funroll-all-loops -fexcess-precision=fast

Thank you,
Neena


[gmx-users] proper use of restraints

2019-05-29 Thread paul buscemi

Dear Users,

I've modeled a simple polymer (nylon), 600 atoms aligned in the x direction. I 
create a restraint file for one molecule using the top-file indices with 
fx, fy, fz = 1 0 0, then place 10 to 100 copies in a box. On equilibration they 
exhibit the expected hydrogen bonding using the gromos54a7 ff. Under NPT with 
pcoupl = surface-tension, ref-p = 1 1 and compressibility = 4e-5 0, the 
molecules remain elongated and some coalesce into groups, but they do not fully 
aggregate. Just why the y direction does not shrink is a puzzle, but not the 
main concern. The x dimension of the box remains constant too; I assumed that 
is because the ends of the polymers are fixed. But there is room for packing 
in the y dimension.

Under pcoupl = isotropic, the ends re-enter the opposite (x) sides as the box 
collapses in the x, y, and z directions (the original box was 300 x 300 x 40 A). 
I was under the impression that under a relatively strong x restraint, the x 
direction would not collapse because the x direction is fixed at the ends. Or 
is the way to look at it that the ends ARE fixed, but the world around them is 
shrinking?

Any comments (well, most any good ones) would be appreciated.


The restraint file is referenced both in the molecule's itp and, as an include 
statement, in the base top file - which is probably redundant if not outright 
wrong.
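
For reference, the usual pattern (a sketch with hypothetical file names, not your actual setup) is to include the position-restraint file exactly once, inside the molecule's own .itp, guarded by a define so it can be toggled from the .mdp with define = -DPOSRES:

; at the end of the polymer's [ moleculetype ] block in polymer.itp
#ifdef POSRES
#include "posre_polymer.itp"
#endif

; posre_polymer.itp
[ position_restraints ]
;  ai   funct   fcx    fcy   fcz    ; force constants in kJ mol^-1 nm^-2
    1     1     1000     0     0    ; restrain first atom in x only
  600     1     1000     0     0    ; restrain last atom in x only

Including the same file a second time from the base .top would define the restraints twice, so dropping one of the two includes seems the safer choice.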

Paul

Re: [gmx-users] Segmentation fault, core dumped error

2019-05-29 Thread Najamuddin Memon
Instead of the .trr file, you can use the .xtc file (together with the .tpr).
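A minimal sketch of that variant, assuming the compressed trajectory was written during the same run:

gmx hbond -f md_0_1.xtc -s md_0_1.tpr -num hbnum.xvg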


On Wed, May 29, 2019, 10:25 PM Neena Susan Eappen <neena.susaneap...@mail.utoronto.ca> wrote:

> Command: gmx hbond -f md_0_1.trr -s md_0_1.tpr
>
> Specify 2 groups to analyze:
> Group 0 (System)       has 130 elements
> Group 1 (Protein)      has 130 elements
> Group 2 (Protein-H)    has  63 elements
> Group 3 (C-alpha)      has  11 elements
> Group 4 (Backbone)     has  34 elements
> Group 5 (MainChain)    has  47 elements
> Group 6 (MainChain+Cb) has  58 elements
> Group 7 (MainChain+H)  has  58 elements
> Group 8 (SideChain)    has  72 elements
> Group 9 (SideChain-H)  has  16 elements
>
> Select a group: 1  Selected 1: 'Protein'
> Select a group: 1  Selected 1: 'Protein'
>
> Output: Calculating hydrogen bonds in Protein (130 atoms)
> Found 13 donors and 25 acceptors
> trr version: GMX_trn_file (single precision)
> Reading frame 0 time 0.000
> Will do grid-seach on 2374x2374x2374 grid, rcut=0.35
> Segmentation fault (core dumped)
>
> 
> From: Neena Susan Eappen
> Sent: Tuesday, May 28, 2019 11:31 PM
> To: gromacs.org_gmx-users@maillist.sys.kth.se
> Subject: Re: [gmx-users] Segmentation fault, core dumped error
>
> Hello gromacs users,
>
> Is there a reason why segmentation fault appeared when gmx hbond command
> was used?
>
> Thank you,
> Neena
>
> 
> From: Neena Susan Eappen
> Sent: Friday, May 24, 2019 12:40 PM
> To: gromacs.org_gmx-users@maillist.sys.kth.se
> Subject: [gmx-users] Segmentation fault, core dumped error
>
> Hello gromacs users,
>
> I got an error message when I used gmx hbond command: Segmentation fault,
> core dumped. Shown below is my gmx version details (if that can point to my
> problem). Any insight would be appreciated.
>
> GROMACS version:    2018.4
> Precision:  single
> Memory model:   64 bit
> MPI library:thread_mpi
> OpenMP support: enabled (GMX_OPENMP_MAX_THREADS = 64)
> GPU support:disabled
> SIMD instructions:  AVX2_256
> FFT library:fftw-3.3.8-sse2-avx-avx2-avx2_128-avx512
> RDTSCP usage:   enabled
> TNG support:enabled
> Hwloc support:  disabled
> Tracing support:disabled
> Built on:   2019-01-12 0:35
> Built by:   name@HP-PC [CMAKE]
> Build OS/arch:  Linux 4.4.0-17134-Microsoft x86_64
> Build CPU vendor:   Intel
> Build CPU brand:Intel(R) Core(TM) i3-5010U CPU @ 2.10GHz
> Build CPU family:   6   Model: 61   Stepping: 4
> Build CPU features: aes apic avx avx2 clfsh cmov cx8 cx16 f16c fma htt
> intel lahf mmx msr nonstop_tsc pcid pclmuldq pdcm pdpe1gb popcnt pse rdrnd
> rdtscp sse2 sse3 sse4.1 sse4.2 ssse3 tdt x2apic
> C compiler: /usr/bin/cc GNU 7.3.0
> C compiler flags:   -march=core-avx2 -O3 -DNDEBUG -funroll-all-loops -fexcess-precision=fast
> C++ compiler:   /usr/bin/c++ GNU 7.3.0
> C++ compiler flags: -march=core-avx2 -std=c++11 -O3 -DNDEBUG -funroll-all-loops -fexcess-precision=fast
>
> Thank you,
> Neena


Re: [gmx-users] Gromacs error

2019-05-29 Thread Dallas Warren
http://manual.gromacs.org/documentation/current/onlinehelp/gmx-covar.html
Has the following note (as does 'gmx help covar'):
"Note that the diagonalization of a matrix requires memory and time
that will increase at least as fast as than the square of the number
of atoms involved. It is easy to run out of memory, in which case this
tool will probably exit with a ‘Segmentation fault’. You should
consider carefully whether a reduced set of atoms will meet your needs
for lower costs."

Plus see http://www.gromacs.org/Documentation/Errors#Cannot_allocate_memory

So it appears that the system size is too large to perform the calculation
with the computer memory you have available. See the last link for
suggestions on how to solve it.
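
The numbers line up with that diagnosis: selecting all 66506 atoms makes the covariance matrix (3N)^2 = 199518^2 = 39807432324 single-precision elements, i.e. exactly the calloc count in the error, or roughly 160 GB. Restricting the analysis to a smaller group shrinks that quadratically; a sketch (the group choice is illustrative and depends on your index):

gmx make_ndx -f pca_dummy.pdb -o calpha.ndx      # select e.g. the C-alpha group
gmx covar -f pca.xtc -s pca_dummy.pdb -n calpha.ndx -nofit -nomwa -nopbc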

Catch ya,

Dr. Dallas Warren
Drug Delivery, Disposition and Dynamics
Monash Institute of Pharmaceutical Sciences, Monash University
381 Royal Parade, Parkville VIC 3052
dallas.war...@monash.edu
-
When the only tool you own is a hammer, every problem begins to resemble a nail.

On Wed, 29 May 2019 at 22:11, Budheswar Dehury  wrote:
>
> Dear GROMACS Developers and users,
>
> While trying to do clustering analysis based on certain distance matrices, and 
> then attempting to perform PCA, I executed the following command and it showed 
> the error below. I need your valuable suggestions and feedback to get rid of 
> such errors.
>
> Thanking you
> Budheswar
>
> gmx covar -f pca.xtc -s pca_dummy.pdb -nofit -nomwa -nopbc
>   :-) GROMACS - gmx covar, 2019.2 (-:
>
> GROMACS is written by:
>  Emile Apol  Rossen Apostolov  Paul Bauer Herman J.C. 
> Berendsen
> Par Bjelkmar  Christian Blau   Viacheslav Bolnykh Kevin Boyd
>  Aldert van Buuren   Rudi van Drunen Anton Feenstra   Alan Gray
>   Gerrit Groenhof Anca HamuraruVincent Hindriksen  M. Eric Irrgang
>   Aleksei Iupinov   Christoph Junghans Joe Jordan Dimitrios Karkoulis
> Peter KassonJiri Kraus  Carsten Kutzner  Per Larsson
>   Justin A. LemkulViveca LindahlMagnus Lundborg Erik Marklund
> Pascal Merz Pieter MeulenhoffTeemu Murtola   Szilard Pall
> Sander Pronk  Roland Schulz  Michael ShirtsAlexey Shvetsov
>Alfons Sijbers Peter Tieleman  Jon Vincent  Teemu Virolainen
>  Christian WennbergMaarten Wolf
>and the project leaders:
> Mark Abraham, Berk Hess, Erik Lindahl, and David van der Spoel
>
> Copyright (c) 1991-2000, University of Groningen, The Netherlands.
> Copyright (c) 2001-2018, The GROMACS development team at
> Uppsala University, Stockholm University and
> the Royal Institute of Technology, Sweden.
> check out http://www.gromacs.org for more information.
>
> GROMACS is free software; you can redistribute it and/or modify it
> under the terms of the GNU Lesser General Public License
> as published by the Free Software Foundation; either version 2.1
> of the License, or (at your option) any later version.
>
> GROMACS:  gmx covar, version 2019.2
> Executable:   /usr/local/gromacs/bin/gmx
> Data prefix:  /usr/local/gromacs
> Working dir:  /home/bd422/Desktop/BD/1
> Command line:
>   gmx covar -f pca.xtc -s pca_dummy.pdb -nofit -nomwa -nopbc
>
>
> WARNING: Masses and atomic (Van der Waals) radii will be guessed
>  based on residue and atom names, since they could not be
>  definitively assigned from the information in your input
>  files. These guessed numbers might deviate from the mass
>  and radius of the atom type. Please check the output
>  files if necessary.
>
>
> Choose a group for the covariance analysis
> Group 0 (System)       has 66506 elements
> Group 1 (Protein)      has 66506 elements
> Group 2 (Protein-H)    has 66506 elements
> Group 3 (C-alpha)      has 66506 elements
> Group 4 (Backbone)     has 66506 elements
> Group 5 (MainChain)    has 66506 elements
> Group 6 (MainChain+Cb) has 66506 elements
> Group 7 (MainChain+H)  has 66506 elements
> Group 8 (SideChain)    has     0 elements
> Group 9 (SideChain-H)  has     0 elements
> Select a group: 0
> Selected 0: 'System'
>
> ---
> Program: gmx covar, version 2019.2
> Source file: src/gromacs/utility/smalloc.cpp (line 125)
>
> Fatal error:
> Not enough memory. Failed to calloc 39807432324 elements of size 4 for mat
> (called from file
> /home/bd422/Downloads/gromacs-2019.2/src/gromacs/gmxana/gmx_covar.cpp, line
> 260)
>
> For more information and tips for troubleshooting, please check the GROMACS
> website at http://www.gromacs.org/Documentation/Errors
>
[gmx-users] Calculation of free energy of distance restraints alone

2019-05-29 Thread ZACHARY GVILDYS
Hello Everybody,

I am trying to determine the free energy of restraining and
"un-restraining" a protein-RNA complex while leaving the vdW and Coulomb
interactions on. My current free-energy parameters are the following, for one
of the .mdp files in the lambda series:
; Free energy control stuff
free_energy              = yes
init_lambda_state        = 18
delta_lambda             = 0
calc_lambda_neighbors    = 1       ; only immediate neighboring windows
; Vectors of lambda specified here
; Each combination is an index that is retrieved from init_lambda_state for each simulation
; init_lambda_state:       0    1    2    3    4    5    6    7    8    9    10   11   12   13   14   15   16   17   18   19   20
vdw_lambdas              = 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
coul_lambdas             = 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
bonded_lambdas           = 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
restraint_lambdas        = 0.00 0.05 0.10 0.15 0.20 0.25 0.30 0.35 0.40 0.45 0.50 0.55 0.60 0.65 0.70 0.75 0.80 0.85 0.90 0.95 1.00
mass_lambdas             = 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
temperature_lambdas      = 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
couple-moltype           = RNA     ; name of moleculetype to decouple
couple-lambda0           = vdw-q   ; vdW and Coulomb both on in state 0
couple-lambda1           = vdw-q   ; vdW and Coulomb both on in state 1 as well
couple-intramol          = yes
nstdhdl                  = 10


I have couple-intramol turned on because the RNA has distance restraints
along its backbone. What I am curious about is what the couple-lambda0 and
couple-lambda1 values should be. There are no restraint options for them, as
far as I know, so I have both vdW and Coulomb interactions on in both
states. However, when I ran this simulation and then ran gmx bar, I got a
change in free energy of 0 and a warning that the initial and final states
are the same. Is there something that I am missing here? Is it possible to
determine the free energy from restraints and nothing else using the
built-in GROMACS tools? I am comfortable using my own methods to calculate
the free energy of restraining the system; however, they are less accurate
than the BAR method GROMACS has implemented.
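
For anyone reading along, a hedged observation: with couple-lambda0 = couple-lambda1 the decoupling end states are identical, so that part contributes nothing; any free-energy difference must come from the restraint_lambdas path, which requires the restraints in the topology to carry both A- and B-state parameters. A minimal sketch of a distance restraint with both states, using the restraint bond potential (funct 10) with hypothetical atom indices and force constants:

[ bonds ]
; ai    aj   funct   lowA   up1A   up2A    kA     lowB   up1B   up2B    kB
  15   210    10     0.30   0.35   0.45    0.0    0.30   0.35   0.45   1000.0

Here restraint_lambdas = 0 -> 1 switches the flat-bottomed restraint from off (kA = 0) to on (kB = 1000 kJ mol^-1 nm^-2); a zero from gmx bar together with the identical-states warning would then point at restraints that lack the B-state columns.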

Thanks in advance for all the help!

Best,
Zach Gvildys


Re: [gmx-users] Segmentation fault, core dumped error

2019-05-29 Thread Neena Susan Eappen
Command: gmx hbond -f md_0_1.trr -s md_0_1.tpr

Specify 2 groups to analyze:
Group 0 (System)       has 130 elements
Group 1 (Protein)      has 130 elements
Group 2 (Protein-H)    has  63 elements
Group 3 (C-alpha)      has  11 elements
Group 4 (Backbone)     has  34 elements
Group 5 (MainChain)    has  47 elements
Group 6 (MainChain+Cb) has  58 elements
Group 7 (MainChain+H)  has  58 elements
Group 8 (SideChain)    has  72 elements
Group 9 (SideChain-H)  has  16 elements

Select a group: 1  Selected 1: 'Protein'
Select a group: 1  Selected 1: 'Protein'

Output: Calculating hydrogen bonds in Protein (130 atoms)
Found 13 donors and 25 acceptors
trr version: GMX_trn_file (single precision)
Reading frame 0 time 0.000
Will do grid-seach on 2374x2374x2374 grid, rcut=0.35
Segmentation fault (core dumped)


From: Neena Susan Eappen
Sent: Tuesday, May 28, 2019 11:31 PM
To: gromacs.org_gmx-users@maillist.sys.kth.se
Subject: Re: [gmx-users] Segmentation fault, core dumped error

Hello gromacs users,

Is there a reason why segmentation fault appeared when gmx hbond command was 
used?

Thank you,
Neena


From: Neena Susan Eappen
Sent: Friday, May 24, 2019 12:40 PM
To: gromacs.org_gmx-users@maillist.sys.kth.se
Subject: [gmx-users] Segmentation fault, core dumped error

Hello gromacs users,

I got an error message when I used gmx hbond command: Segmentation fault, core 
dumped. Shown below is my gmx version details (if that can point to my 
problem). Any insight would be appreciated.

GROMACS version:    2018.4
Precision:  single
Memory model:   64 bit
MPI library:thread_mpi
OpenMP support: enabled (GMX_OPENMP_MAX_THREADS = 64)
GPU support:disabled
SIMD instructions:  AVX2_256
FFT library:fftw-3.3.8-sse2-avx-avx2-avx2_128-avx512
RDTSCP usage:   enabled
TNG support:enabled
Hwloc support:  disabled
Tracing support:disabled
Built on:   2019-01-12 0:35
Built by:   name@HP-PC [CMAKE]
Build OS/arch:  Linux 4.4.0-17134-Microsoft x86_64
Build CPU vendor:   Intel
Build CPU brand:Intel(R) Core(TM) i3-5010U CPU @ 2.10GHz
Build CPU family:   6   Model: 61   Stepping: 4
Build CPU features: aes apic avx avx2 clfsh cmov cx8 cx16 f16c fma htt 
intel lahf mmx msr nonstop_tsc pcid pclmuldq pdcm pdpe1gb popcnt pse rdrnd 
rdtscp sse2 sse3 sse4.1 sse4.2 ssse3 tdt x2apic
C compiler: /usr/bin/cc GNU 7.3.0
C compiler flags:   -march=core-avx2 -O3 -DNDEBUG -funroll-all-loops -fexcess-precision=fast
C++ compiler:   /usr/bin/c++ GNU 7.3.0
C++ compiler flags: -march=core-avx2 -std=c++11 -O3 -DNDEBUG -funroll-all-loops -fexcess-precision=fast

Thank you,
Neena


Re: [gmx-users] Stress-Strain curve plot

2019-05-29 Thread Anh Vo
Hi,

Thank you very much for your answers. There are many questions that I had
searched online and looked up in the GROMACS manual but was still confused
about, so I asked here. Your answers help to clarify them a lot for me; I
really appreciate it.

However, there are some questions that I'm still not very clear about.
Please help me to understand them better if possible.

- When I simulated an equilibrated bilayer membrane under deformation in
GROMACS and compared the stress-strain curve with previous results in
LAMMPS, I got a lower yield stress and a higher yield strain. I think this
means the phospholipid chains may not be as “grabby” toward one another, or
the chains are not pulling away from one another as quickly.
Is this problem due to incorrect bonding or constraints between the
molecules? If this is the case, which files or which parameters should I
look into to modify the bonds or constraints?

- In the production step, when the membrane is being deformed (equibiaxial
tension with equal deformation velocities in the X and Y directions), GROMACS
produced this error: "The domain decomposition grid has shifted too much
in the Z-direction around cell 0 0 3. This should not have happened.
Running with fewer ranks might avoid this issue".
Is the domain decomposition automatically set in GROMACS for running a
simulation, and how can I avoid this problem?
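
A note for the archive: yes, mdrun chooses a domain decomposition automatically from the number of ranks it is given, so one way to act on the error's own suggestion is to set the rank count explicitly. A sketch, where "md" stands in for your own file names:

gmx mdrun -deffnm md -ntmpi 2 -ntomp 4     # thread-MPI build: 2 ranks, 4 OpenMP threads each
mpirun -np 2 gmx_mpi mdrun -deffnm md      # MPI build: 2 ranks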

Thank you very much.

Anh Vo


>> -----Original Message-----
>>
>> Message: 6
>> Date: Wed, 29 May 2019 01:00:55 +0200
>> From: Mark Abraham 
>>
>> Hi,
>>
>> On Tue, 28 May 2019 at 23:26, Anh Vo  wrote:
>>
>> > Hi all,
>> >
>> > I am running GROMACS to simulate a simple phospholipid bilayer membrane
>> > (POPC) under deformation, and plot the von Mises stress-strain curve
>> > to determine the yielding point.
>> >
>> > I have got the membrane deformed and extracted output pressures from the
>> > .edr file (option PRESS X, PRESS Y, PRESS Z) to calculate the stress,
>> > then use that calculated stress and true strain to plot the curve. The
>> > general shape of the curve is correct; however, the pressure value
>> > fluctuates widely in GROMACS and the yield stress/yield strain is
>> > wrong. Why did I get the wrong value, and where/which parameter files
>> > should I address to solve this problem?
>>
>> Pressure is a macroscopic quantity, and fluctuates more on short time
>> scales and over small numbers of particles. Usually there is a mismatch in
>> expectations that it fluctuates in the same manner as e.g. temperature.
>>
>> > 1. I got a lower yield stress and a higher yield strain than it should
>> > be (compared to previous research findings on a similar problem using
>> > LAMMPS). Why did I get this problem, and what should I fix to get the
>> > right yielding stress/strain values?
>>
>> That depends on the models used. It might not be known that they are
>> capable of producing the right yielding values.
>>
>> > 2. Is the wrong value due to incorrect bonding or constraints
>> > between the molecules? If this is the case, which files or which
>> > parameters should I look into to modify the bonds or constraints? Does
>> > lower yield stress and higher yield strain mean that the bonds in
>> > GROMACS are more ductile?
>>
>> Good question. I don't know if those are relevant things to verify. Is
>> there knowledge e.g. in the LAMMPS community of practice?
>>
>> > 3. Sometimes I run the same simulation several times without changing
>> > anything, to double check. I get similar results most of the time,
>> > but sometimes the simulation stops sooner and an error is reported like
>> > this: "The domain decomposition grid has shifted too much in the
>> > Z-direction around cell 0 0 3. This should not have happened. Running
>> > with fewer ranks might avoid this issue".
>> >
>> > What does this error mean? What is domain decomposition? Is the domain
>> > decomposition automatically set in GROMACS for running the simulation?
>>
>> When there are multiple pieces of hardware, GROMACS has to distribute the
>> work across them. Such errors indicate that the changes over time are
>> larger than might be expected in equilibrium (or near equilibrium)
>> simulations, which is what the implementations in GROMACS target. What
>> simulations produced such messages?
>>
>> Mark

Re: [gmx-users] Explaination of some terminology

2019-05-29 Thread Benson Muite


On 5/29/19 4:07 AM, Anh Vo wrote:

Hi all,

I'm new to GROMACS, and there are some terms that I’m not very clear about
when reading GROMACS materials. I have looked them up and read more, but I’m
still confused. Please help me to clarify them.



  - What is improper vs. proper dihedral?



  - It is said that “If a dynamical system is ergodic, the ensemble
average becomes equal to the time average”.

What does it mean if a system is ergodic? Why does ensemble
sampling become similar to time sampling if a system is ergodic?
Ensemble sampling becomes similar to time sampling because, over a long 
enough time period, an ergodic system explores all points in phase space. 
Thus, instead of doing multiple simulations with the same global average 
quantities (such as the total system energy), one can do one long simulation 
with the same global average quantity. A slightly more involved explanation 
is at https://en.wikipedia.org/wiki/Ergodicity
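
In symbols (a textbook identity, not anything GROMACS-specific): for an observable A and an ergodic system with equilibrium phase-space density \rho(x),

    \lim_{T \to \infty} \frac{1}{T} \int_0^T A\bigl(x(t)\bigr)\,dt \;=\; \langle A \rangle \;=\; \int A(x)\,\rho(x)\,dx

for almost every initial condition x(0); the left side is the time average over one long trajectory, the right side the ensemble average.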



Thank you very much.


Best,

Anh Vo


[gmx-users] Gromacs error

2019-05-29 Thread Budheswar Dehury
Dear GROMACS Developers and users,

While trying to do clustering analysis based on certain distance matrices, and 
then attempting to perform PCA, I executed the following command and it showed 
the error below. I need your valuable suggestions and feedback to get rid of 
such errors.

Thanking you
Budheswar

gmx covar -f pca.xtc -s pca_dummy.pdb -nofit -nomwa -nopbc
  :-) GROMACS - gmx covar, 2019.2 (-:

GROMACS is written by:
 Emile Apol  Rossen Apostolov  Paul Bauer Herman J.C. Berendsen
Par Bjelkmar  Christian Blau   Viacheslav Bolnykh Kevin Boyd
 Aldert van Buuren   Rudi van Drunen Anton Feenstra   Alan Gray
  Gerrit Groenhof Anca HamuraruVincent Hindriksen  M. Eric Irrgang
  Aleksei Iupinov   Christoph Junghans Joe Jordan Dimitrios Karkoulis
Peter KassonJiri Kraus  Carsten Kutzner  Per Larsson
  Justin A. LemkulViveca LindahlMagnus Lundborg Erik Marklund
Pascal Merz Pieter MeulenhoffTeemu Murtola   Szilard Pall
Sander Pronk  Roland Schulz  Michael ShirtsAlexey Shvetsov
   Alfons Sijbers Peter Tieleman  Jon Vincent  Teemu Virolainen
 Christian WennbergMaarten Wolf
   and the project leaders:
Mark Abraham, Berk Hess, Erik Lindahl, and David van der Spoel

Copyright (c) 1991-2000, University of Groningen, The Netherlands.
Copyright (c) 2001-2018, The GROMACS development team at
Uppsala University, Stockholm University and
the Royal Institute of Technology, Sweden.
check out http://www.gromacs.org for more information.

GROMACS is free software; you can redistribute it and/or modify it
under the terms of the GNU Lesser General Public License
as published by the Free Software Foundation; either version 2.1
of the License, or (at your option) any later version.

GROMACS:  gmx covar, version 2019.2
Executable:   /usr/local/gromacs/bin/gmx
Data prefix:  /usr/local/gromacs
Working dir:  /home/bd422/Desktop/BD/1
Command line:
  gmx covar -f pca.xtc -s pca_dummy.pdb -nofit -nomwa -nopbc


WARNING: Masses and atomic (Van der Waals) radii will be guessed
 based on residue and atom names, since they could not be
 definitively assigned from the information in your input
 files. These guessed numbers might deviate from the mass
 and radius of the atom type. Please check the output
 files if necessary.


Choose a group for the covariance analysis
Group 0 (System)       has 66506 elements
Group 1 (Protein)      has 66506 elements
Group 2 (Protein-H)    has 66506 elements
Group 3 (C-alpha)      has 66506 elements
Group 4 (Backbone)     has 66506 elements
Group 5 (MainChain)    has 66506 elements
Group 6 (MainChain+Cb) has 66506 elements
Group 7 (MainChain+H)  has 66506 elements
Group 8 (SideChain)    has     0 elements
Group 9 (SideChain-H)  has     0 elements
Select a group: 0
Selected 0: 'System'

---
Program: gmx covar, version 2019.2
Source file: src/gromacs/utility/smalloc.cpp (line 125)

Fatal error:
Not enough memory. Failed to calloc 39807432324 elements of size 4 for mat
(called from file
/home/bd422/Downloads/gromacs-2019.2/src/gromacs/gmxana/gmx_covar.cpp, line
260)

For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors



[gmx-users] Question of constraining bonds

2019-05-29 Thread Dong Woo Kang
Dear all gromacs users,

I'm new to gromacs, and I have some questions about bond constraints. I have a 
system with methane, hydrogen, and TIP4P/Ew water. In particular, the topology 
of the hydrogen molecule includes a [ constraints ] section, used to construct 
a virtual site at the center of the hydrogen molecule. With 
"constraint-algorithm = lincs" and "constraints = h-bonds", I found that all 
bonds involving H were constrained. So I changed the *.mdp file to 
"constraint-algorithm = lincs" and "constraints = none", and found that LINCS 
is applied to hydrogen, but SETTLE seems to be applied to water instead of 
LINCS. If I want to apply LINCS to both the hydrogen molecules and the waters, 
but not to methane or other hydrocarbons containing hydrogen atoms, how can I 
do that? Any advice will be very helpful.

Dong Woo Kang
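
For the archive, a hedged sketch of one way to steer this per molecule: with constraints = none in the .mdp, grompp constrains only what each topology declares, so the hydrogen molecule keeps its explicit [ constraints ] under LINCS while methane's bonds stay flexible. Water runs under SETTLE because its topology declares [ settles ]; replacing that section with explicit constraints should, in principle, hand water to LINCS as well - untested here, and LINCS on fully rigid water triangles is generally discouraged for performance and stability:

; hypothetical TIP4P/Ew water fragment, replacing [ settles ]
[ constraints ]
;  ai   aj  funct   length (nm)
    1    2    1     0.09572   ; O-H1
    1    3    1     0.09572   ; O-H2
    2    3    1     0.15139   ; H1-H2, fixes the H-O-H angle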

Re: [gmx-users] Segmentation fault, core dumped error

2019-05-29 Thread Paul bauer

Hello,

can you please open a new issue on redmine.gromacs.org for this, and 
upload your input files (if possible) and the command line you used that 
caused the error?


Thanks

Paul

On 29/05/2019 01:31, Neena Susan Eappen wrote:

Hello gromacs users,

Is there a reason why segmentation fault appeared when gmx hbond command was 
used?

Thank you,
Neena


From: Neena Susan Eappen
Sent: Friday, May 24, 2019 12:40 PM
To: gromacs.org_gmx-users@maillist.sys.kth.se
Subject: [gmx-users] Segmentation fault, core dumped error

Hello gromacs users,

I got an error message when I used gmx hbond command: Segmentation fault, core 
dumped. Shown below is my gmx version details (if that can point to my 
problem). Any insight would be appreciated.

GROMACS version:    2018.4
Precision:  single
Memory model:   64 bit
MPI library:thread_mpi
OpenMP support: enabled (GMX_OPENMP_MAX_THREADS = 64)
GPU support:disabled
SIMD instructions:  AVX2_256
FFT library:fftw-3.3.8-sse2-avx-avx2-avx2_128-avx512
RDTSCP usage:   enabled
TNG support:enabled
Hwloc support:  disabled
Tracing support:disabled
Built on:   2019-01-12 0:35
Built by:   name@HP-PC [CMAKE]
Build OS/arch:  Linux 4.4.0-17134-Microsoft x86_64
Build CPU vendor:   Intel
Build CPU brand:Intel(R) Core(TM) i3-5010U CPU @ 2.10GHz
Build CPU family:   6   Model: 61   Stepping: 4
Build CPU features: aes apic avx avx2 clfsh cmov cx8 cx16 f16c fma htt 
intel lahf mmx msr nonstop_tsc pcid pclmuldq pdcm pdpe1gb popcnt pse rdrnd 
rdtscp sse2 sse3 sse4.1 sse4.2 ssse3 tdt x2apic
C compiler: /usr/bin/cc GNU 7.3.0
C compiler flags:   -march=core-avx2 -O3 -DNDEBUG -funroll-all-loops -fexcess-precision=fast
C++ compiler:   /usr/bin/c++ GNU 7.3.0
C++ compiler flags: -march=core-avx2 -std=c++11 -O3 -DNDEBUG -funroll-all-loops -fexcess-precision=fast

Thank you,
Neena



--
Paul Bauer, PhD
GROMACS Release Manager
KTH Stockholm, SciLifeLab
0046737308594



[gmx-users] GROMACS 2018.7 patch release available

2019-05-29 Thread Paul bauer

Hi GROMACS users,

I just released another patch release of GROMACS 2018, bringing the 
official version to 2018.7!


We decided to release this additional patch due to issues found in 
2018.6 that could affect scientific correctness, so we encourage all 
users of the 2018 series to update to 2018.7. Please see the link to the 
release notes below for more details.
The 2018 branch now receives fixes only for issues that affect 
scientific correctness, while the 2019 branch is under active support.


You can find the code, documentation, release notes, and test suite at 
the links below.


Code: ftp://ftp.gromacs.org/pub/gromacs/gromacs-2018.7.tar.gz
Documentation: http://manual.gromacs.org/2018.7/index.html
(including release notes, install guide, user guide, reference manual)
Test Suite: http://gerrit.gromacs.org/download/regressiontests-2018.7.tar.gz

Happy simulating!

Paul

--
Paul Bauer, PhD
GROMACS Release Manager
KTH Stockholm, SciLifeLab
0046737308594



Re: [gmx-users] Timestep in GROMACS simulation

2019-05-29 Thread John Whittaker
Hi,

> I'm still not clear what "sampling" means. Does it mean collecting data
> from the simulation?

Pretty much. The point of a molecular dynamics simulation is to calculate
structural, thermodynamic, and other properties of a system from the
microscopic dynamics of its atoms and molecules. Collecting this
information from the simulation is "sampling" that property from its
overall distribution. Typically, the longer you simulate, the better your
sampling, but this isn't *always* true or feasible (hence the development
of enhanced sampling methods to speed the process up).
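
One concrete way to gauge how well a given property is sampled, as a sketch (file names are placeholders): pull the property out of the energy file and ask for a block-averaged error estimate, which only converges once the run contains enough independent configurations.

gmx energy -f md.edr -o potential.xvg     # choose a term at the prompt
gmx analyze -f potential.xvg -ee          # error estimate via block averaging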

By the way, the majority of the questions you have asked could probably be
answered much quicker and more in-depth if you just searched for them on
Google or took a look in the GROMACS manual. A massive amount of effort
has been put into tackling the MD "sampling problem", so you'll find lots
of information immediately.

Best,

John




Re: [gmx-users] Computational electrophysiology (compEL) setup issues (Kutzner, Carsten)

2019-05-29 Thread Kutzner, Carsten
Dear Francesco,

> On 29. May 2019, at 10:01, Francesco Petrizzelli wrote:
> 
>> A cyl0-r (and cyl1-r) of 0.7 nm is too small for a pore radius of 3 A to 
>> reliably track the ions, some might sneak through your channel without being 
>> recorded in the cylinder. Rather choose this value a bit too large than too 
>> small. It should be *at least* half of the pore radius.
> 
> Regarding this aspect I'm not sure I got what you mean. The pore radius is 3 
> A, and I set up the cyl0-radius of 0.7 nm, which should correspond to about 7 
> A. You said it should be at least half of the pore radius and I am confused 
> about this. If this can help, the structure that I am using to test this 
> technique (I am planning to perform a large number of simulations on 
> different channels) is 5va1 (KCNH2). Actually, I recalculated the pore radius 
> and it is larger than 3 A because I have only considered the selectivity 
> filter.
Sorry, my bad! If the pore radius is 0.3 nm, then a 0.7 nm cylinder radius is 
sufficient.
The idea is to select the cylinder dimensions such that each ion that 
permeates through the channel *has* to go through the cylinder. This is only 
for counting how many ions permeate through channel A versus channel B. The 
CompEL ion/water swapping protocol works independently of the permeation 
counting: to perform the swaps, only the number of ions of each type in each 
compartment matters.
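
A sketch of the relevant .mdp fields for such a setup (values are illustrative, not a recommendation; the split and solvent group names must match your index file):

swapcoords     = Z
swap-frequency = 100
split-group0   = channel0
split-group1   = channel1
solvent-group  = SOL
cyl0-r         = 0.7     ; nm, generous relative to a 0.3 nm pore
cyl0-up        = 1.0     ; nm, cylinder extent above the channel center
cyl0-down      = 1.0     ; nm, cylinder extent below
cyl1-r         = 0.7
cyl1-up        = 1.0
cyl1-down      = 1.0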

Best,
  Carsten



Re: [gmx-users] Computational electrophysiology (compEL) setup issues (Kutzner, Carsten)

2019-05-29 Thread Francesco Petrizzelli
Thank you, Dr. Kutzner, for your reply. I really appreciate your work on compEL.
>It is even easier to just generate a single bilayer with a channel, and then 
>duplicate the whole system by stacking two copies on top of each other, e.g. 
>with the script found on page www.mpibpc.mpg.de/grubmueller/compel under 
>www.mpibpc.mpg.de/15388676/makeSandwich.tgz
Thanks for the suggestion. I chose packmol-memgen because I'm a new gmx user 
and felt more confident with the Amber setup (I performed minimization and 
equilibration with it and then converted to Gromacs), but after this training 
period I will use this script to duplicate the system.

>The .xvg output file also lists the z-position of your split group (=channel) 
>centers over time. Can you check in a molecular viewer whether these make 
>sense? They should be in the middle (regarding z coordinate) of each of the 
>two membranes. This is important, because these z-positions define the 
>compartment boundaries.

I have checked this, and the z-position is exactly in the middle of each of 
the two membranes (in detail, it maps to the channel center).

>A cyl0-r (and cyl1-r) of 0.7 nm is too small for a pore radius of 3 A to 
>reliably track the ions, some might sneak through your channel without being 
>recorded in the cylinder. Rather choose this value a bit too large than too 
>small. It should be *at least* half of the pore radius.

Regarding this aspect, I'm not sure I got what you mean. The pore radius is 3 A, 
and I set up the cyl0-radius as 0.7 nm, which should correspond to about 7 A. 
You said it should be at least half of the pore radius, and I am confused about 
this. If this can help, the structure that I am using to test this technique (I 
am planning to perform a large number of simulations on different channels) is 
5va1 (KCNH2). Actually, I recalculated the pore radius and it is larger than 3 A 
because I had only considered the selectivity filter.

You set -1, so if ions move through the channels, the protocol restores the 
numbers found at the start of the simulation, which may or may not give you an 
ionic imbalance and a voltage across the membrane.
You need to put positive numbers there, which add up to the actual number of 
ions in the simulation; e.g., let's say you have a total of 20 Cl- and 24 K+:
iontype0-name = Cl-
iontype0-in-A = 10  # means 10 Cl- in compartment A
iontype0-in-B = 10  # means 10 Cl- in compartment B as well
iontype1-name = K+
iontype1-in-A = 13  # 13 K+ in A
iontype1-in-B = 11  # 11 K+ in B
This will set and keep an imbalance of two elementary charges between the 
compartments, which will be established at the first time step by exchanging an 
appropriate number of ions and water between A and B.

This aspect is much clearer now. Thank you again.

Bests,
Francesco