Re: [gmx-users] Define intermolecular interactions in L-J simulation

2019-10-23 Thread David van der Spoel

On 2019-10-23 at 18:22, Li, Shi wrote:

Dear GMX users,

I am wondering if there is a way to define the intermolecular interactions
to simulate a binary LJ system. For example, I have two atom types, A and B,
which share the same LJ parameters, and I want to change the interaction
parameters between A and B so that I would expect different behaviors from
the simulation (mixing or phase separation).

I checked the manual and found that this can be defined in the topology file
under [intermolecular_interactions], using the [pairs] interaction to list
the atom pairs. But it is still confusing: if I have a system containing
500 A and 500 B, how can I apply this to the entire binary system? I
assumed that I could use atom types instead of atom numbers, but how and
where do I specify that?

Any suggestions?

Thanks,
Shi


Use
[ nonbond_params ]
A A 1  c6 c12
B B 1  c6 c12
A B 1  c6 c12
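
For context, a sketch of where this sits in the force-field section of the
topology. The comb-rule field in [ defaults ] decides whether the columns
are c6/c12 or sigma/epsilon; the type names and values below are roughly
argon-like placeholders, not from this thread, and the A-A and B-B lines
may be omitted since the combination rule already generates them:

[ defaults ]
; nbfunc  comb-rule  gen-pairs  fudgeLJ  fudgeQQ
  1       1          no         1.0      1.0    ; comb-rule 1: c6/c12 columns

[ atomtypes ]
; name  at.num  mass    charge  ptype  c6        c12
  A     18      39.948  0.000   A      6.2e-03   9.7e-06
  B     18      39.948  0.000   A      6.2e-03   9.7e-06

[ nonbond_params ]
; i  j  func  c6        c12
  A  B  1     3.1e-03   9.7e-06  ; e.g. halving c6 weakens A-B attraction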

--
David van der Spoel, Ph.D., Professor of Biology
Head of Department, Cell & Molecular Biology, Uppsala University.
Box 596, SE-75124 Uppsala, Sweden. Phone: +46184714205.
http://www.icm.uu.se
--
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.


[gmx-users] QM/MM run tips or tutorials

2019-10-23 Thread Bogdanov, Vladimir
Hi all,

I would like to try GROMACS 2018 with QM. I am comfortable enough running MD
in GROMACS, but have never used QM. Any tips on how to get started, or any
links to tutorials, would be very helpful.

Best regards,
Vlad
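
In the GROMACS 2018 series, the built-in QM/MM interface is configured
through .mdp options and requires a build linked against a supported QM
package (e.g. Gaussian or ORCA); see the reference manual for details. As a
minimal sketch, where the index-group name and the method/basis values are
only illustrative:

QMMM        = yes
QMMM-grps   = QMatoms   ; index group holding the QM region (assumed name)
QMMMscheme  = normal
QMmethod    = B3LYP
QMbasis     = 6-31G*
QMcharge    = 0         ; net charge of the QM region
QMmult      = 1         ; spin multiplicity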



Re: [gmx-users] Define intermolecular interactions in L-J simulation

2019-10-23 Thread Justin Lemkul




On 10/23/19 12:22 PM, Li, Shi wrote:

Dear GMX users,

I am wondering if there is a way to define the intermolecular interactions
to simulate a binary LJ system. For example, I have two atom types, A and B,
which share the same LJ parameters, and I want to change the interaction
parameters between A and B so that I would expect different behaviors from
the simulation (mixing or phase separation).

I checked the manual and found that this can be defined in the topology file
under [intermolecular_interactions], using the [pairs] interaction to list the


[intermolecular_interactions] is for bonded interactions between molecules,
and [pairs] is for 1-4 interactions.



atom pairs. But it is still confusing: if I have a system containing 500 A
and 500 B, how can I apply this to the entire binary system? I assumed that
I could use atom types instead of atom numbers, but how and where do I
specify that?


It sounds like you want to override the combination rules, in which case you
want [nonbond_params] to create an off-diagonal LJ interaction. A and B need
to be defined as different atom types but can have the same LJ parameters,
so A-A and B-B interactions obey the combination rule while A-B can be
whatever you want it to be.
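
If the force field declares comb-rule 2 or 3 in [ defaults ], the same
off-diagonal override is written in sigma/epsilon rather than c6/c12; a
sketch with illustrative numbers, where a shallower A-B well than the
A-A/B-B one favors demixing:

[ nonbond_params ]
; i  j  func  sigma (nm)  epsilon (kJ/mol)
  A  B  1     0.34        0.50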


-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==



[gmx-users] Define intermolecular interactions in L-J simulation

2019-10-23 Thread Li, Shi
Dear GMX users,

I am wondering if there is a way to define the intermolecular interactions
to simulate a binary LJ system. For example, I have two atom types, A and B,
which share the same LJ parameters, and I want to change the interaction
parameters between A and B so that I would expect different behaviors from
the simulation (mixing or phase separation).

I checked the manual and found that this can be defined in the topology file
under [intermolecular_interactions], using the [pairs] interaction to list
the atom pairs. But it is still confusing: if I have a system containing
500 A and 500 B, how can I apply this to the entire binary system? I
assumed that I could use atom types instead of atom numbers, but how and
where do I specify that?

Any suggestions?

Thanks,
Shi


Re: [gmx-users] Reg: Zinc ion gets displaced during Protein-Zn-Ligand simulation

2019-10-23 Thread Justin Lemkul




On 10/22/19 12:04 AM, Amit Jaiswal wrote:

Dear Jorden,
Thanks for your reply. As you suggested, I found there is a mismatch
between the atom number in the zn.itp file and the .gro file. I have
included a few residues of the .gro file for your convenience.
What I understand is that I have to renumber the atom in the zn.itp file
to 4265 and not 2220. Please correct me if I am wrong.


The global atom number is irrelevant in defining position restraints 
(and will actually trigger a warning in your case). If Zn is defined as 
a separate [moleculetype] in its own .itp file, the only valid atom 
number for position restraints is 1. See 
http://manual.gromacs.org/current/user-guide/run-time-errors.html#atom-index-n-in-position-restraints-out-of-bounds
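
As a minimal sketch of such a single-ion [moleculetype] (atom and residue
names follow the coordinate excerpt below; the atom type name and the force
constants are illustrative):

[ moleculetype ]
; name  nrexcl
ZN      1

[ atoms ]
;  nr  type  resnr  resname  atom  cgnr  charge  mass
   1   Zn    1      ZN       ZN    1     2.000   65.38

#ifdef POSRES
[ position_restraints ]
; atom  func  fcx   fcy   fcz   (kJ mol^-1 nm^-2)
   1    1     1000  1000  1000
#endif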


-Justin

You also suggested that I "minimize the system very carefully". What do
you mean by this? Should I use fewer minimization steps?

Thanks for your time and efforts.
With kind regards,
Amit
391THR HG22 4260 4.897 5.356 3.136
391THR HG23 4261 4.889 5.247 2.993
391THR C 4262 4.600 5.071 3.244
391THR OT1 4263 4.599 5.012 3.355
391THR OT2 4264 4.496 5.082 3.173
392ZN ZN 4265 7.278 6.612 5.838
393NAD PA 4266 6.217 7.359 2.802
393NAD O1A 4267 6.090 7.410 2.863
393NAD O2A 4268 6.337 7.451 2.808
393NAD O5B 4269 6.185 7.331 2.647
393NAD C5B 4270 6.082 7.233 2.620
19.10.2019, 21:53, "Jorden Cabal" :

Dear Amit,
Your files look correct to me. If atom "2220" in your coordinate file is
the Zn atom, it should not be displaced, because from your mdp file and
topology settings you have restrained all the heavy atoms of the protein,
NAD, and Zn. I don't understand why it is happening. Even the restraining
force you are using is strong enough.
I suggest you check whether atom number 2220 in the coordinate (.gro)
file is the Zn atom or not. If it is not, then you have selected the
wrong atom number for restraining. Also, if you are following the
standard tutorial, whose energy minimization does not restrain any
atoms, I suggest you check the position of the Zn atom in the structure
you get after energy minimization. If the location of the Zn ion changes
during the EM, then you will need to minimize the system very carefully.
Hope this will fix your issue.
Thank you





--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==



Re: [gmx-users] regression test errors

2019-10-23 Thread Dave M
Thanks a lot Paul.

best regards,
D

Re: [gmx-users] regression test errors

2019-10-23 Thread Paul bauer

Hello Dave,

I thought it was something like that.
The error is harmless (just telling you that MPI is doing its job), and 
the testing script gets confused because of the extra message in the 
output file.


So I think you are good to go (and we need to do something about the 
testing script).
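
If that stray Open MPI note is all that trips the comparison, silencing it
(as the note itself suggests) should be enough. Assuming Open MPI's usual
MCA conventions, either per run or per session:

# per run
mpirun --mca btl_base_warn_component_unused 0 -np 2 gmx_mpi mdrun ...
# or for the whole session
export OMPI_MCA_btl_base_warn_component_unused=0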


Happy simulating!

Cheers

Paul


Re: [gmx-users] regression test errors

2019-10-23 Thread Dave M
Hi Paul,

I checked using this command for a specific folder, and I used '-mpirun
mdrun' rather than '-mpirun mpirun':


./gmxtest.pl -mpirun mdrun -np 2 -noverbose rotation



I get a lot of these errors:


topol.tpr file different from ./reference_s.tpr. Check files in flex for
flex

FAILED. Check checktpr.out, checktpr.err file(s) in flex for flex

topol.tpr file different from ./reference_s.tpr. Check files in flex-t for
flex-t

FAILED. Check checktpr.out, checktpr.err file(s) in flex-t for flex-t

topol.tpr file different from ./reference_s.tpr. Check files in flex2 for
flex2

FAILED. Check checktpr.out, checktpr.err file(s) in flex2 for flex2

topol.tpr file different from ./reference_s.tpr. Check files in flex2-t for
flex2-t

FAILED. Check checktpr.out, checktpr.err file(s) in flex2-t for flex2-t


 so on



A) The only suspicious thing I see in checktpr.err is possibly different
software versions.


Command line:

  gmx_mpi check -s1 ./reference_s.tpr -s2 topol.tpr -tol 0.0001 -abstol
0.001


Note: When comparing run input files, default tolerances are reduced.

Reading file ./reference_s.tpr, VERSION 5.0-beta2-dev-20140130-02adca5
(single precision)

Note: file tpx version 96, software tpx version 116

Reading file topol.tpr, VERSION 2019.4 (single precision)




B) The only suspicious thing I see in checktpr.out is pasted below (I have
removed the host IP number). Just to mention, I use Amazon Web Services, so
the following error is probably related to the instance being created,
stored as an image, and then re-used with a different IP. Maybe I am just
talking nonsense!


[[2115,1],0]: A high-performance Open MPI point-to-point messaging module

was unable to find any relevant network interfaces:


Module: OpenFabrics (openib)

  Host: ip-xxx-xx-xx-xxx


Another transport will be used instead, although this may result in

lower performance.


NOTE: You can disable this warning by setting the MCA parameter

btl_base_warn_component_unused to 0.



Re: [gmx-users] regression test errors

2019-10-23 Thread Paul bauer

Hello Dave,

This is weird; no idea why it didn't work then.
You can try running the test suite manually in the folder you found with

perl gmxtest.pl -mpirun mpirun -np X -noverbose

That will show if the test binary works and should report any failing tests.
Don't forget to source the GMXRC file before trying, though!
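
Concretely, with the install prefix and test location mentioned earlier in
this thread, that would look something like:

source /usr/local/gromacs/gromacs2019_4/bin/GMXRC
cd build/tests/regressiontests
perl gmxtest.pl -mpirun mpirun -np 2 -noverbose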

Cheers

Paul


Re: [gmx-users] regression test errors

2019-10-23 Thread Dave M
Hi Paul,

Thanks for the 'mpirun -n X gmx_mpi mdrun'. It works now.

Regarding the tests, I found the folder here: build/tests/regressiontests.
So I checked all the logs using a simple script (searching for the keyword
'Finished'), and it shows that all the log files finished properly in
their corresponding folders. So the log files do not say anything here.
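
Such a check can be a one-liner; a sketch with GNU grep that would instead
list any .log file missing that keyword:

grep -rL "Finished" --include="*.log" .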

Re: [gmx-users] regression test errors

2019-10-23 Thread Paul Bauer
Hello Dave,

You need to use mpirun -n (number of processes) gmx_mpi mdrun to use an
MPI-enabled build of GROMACS. This is what the error message tries to tell
you, but we might need to improve on this.
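
Applied to the command from earlier in this thread, that looks like the
following; note that -nt only applies to thread-MPI builds:

# real-MPI build: the MPI launcher sets the rank count; drop -nt
mpirun -np 2 gmx_mpi mdrun -v -deffnm 03-run -rdd 2.0
# thread-MPI build (binary named gmx): -nt works there
# gmx mdrun -v -deffnm 03-run -rdd 2.0 -nt 2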

There should be a regressiontests folder somewhere in your build tree if it
downloaded the tests correctly.

Cheers

Paul


Re: [gmx-users] regression test errors

2019-10-23 Thread Dave M
Hi Paul,

Thanks for your reply.
a) I just checked: there is no tests/regressiontests; another folder,
tests/physicalvalidation, is there. There is no log file.
b) Regarding thread-MPI, I think it is not installed, because when I use a
command like this:


gmx_mpi mdrun -v -deffnm 03-run -rdd 2.0 -nt 2

I get an error:


Fatal error:

Setting the total number of threads is only supported with thread-MPI and
GROMACS was compiled without thread-MPI

I think (please correct me) gmx_mpi is for external MPI, OpenMPI in my
case, so I tried just 'gmx mdrun' (not gmx_mpi), but then it says the
command is not found. I am not sure what I missed in the installation
cmake flags.

Dave



Re: [gmx-users] regression test errors

2019-10-23 Thread Paul Bauer
Hello Dave,

Did you have a look into the log files from the regression tests under
tests/regressiontests?
They might give us some insight into what is happening.

The warning with respect to thread-MPI is harmless; it just tells you that
you are using real MPI instead of thread-MPI.

Cheers
Paul

On Wed, 23 Oct 2019, 07:36 Dave M,  wrote:

> Hi All,
>
> Any hints/help would be much appreciated on why I am getting regression
> test failures. Also, I think thread-MPI was not installed, as I got an
> error saying "MPI is not compatible with thread-MPI. Disabling
> thread-MPI". How do I check the compatibility?
>
> Thanks.
>
> best regards,
> D
>
> On Sun, Oct 20, 2019 at 2:58 AM Dave M  wrote:
>
> > Hi All,
> >
> > I am trying to install gromacs2019.4 with:
> > cmake ..  -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=on
> > -DGMX_MPI=on -DGMX_GPU=on
> > -DCMAKE_INSTALL_PREFIX=/usr/local/gromacs/gromacs2019_4
> > -DGMX_FFT_LIBRARY=fftw3 -DCMAKE_BUILD_TYPE=Debug
> >
> > But 5 tests (41 to 45) failed, copied below:
> >
> >
> > The following tests FAILED:
> >
> > 41 - regressiontests/simple (Failed)
> >
> > 42 - regressiontests/complex (Failed)
> >
> > 43 - regressiontests/kernel (Failed)
> >
> > 44 - regressiontests/freeenergy (Failed)
> >
> > 45 - regressiontests/rotation (Failed)
> >
> > Errors while running CTest
> >
> > CMakeFiles/run-ctest-nophys.dir/build.make:57: recipe for target
> > 'CMakeFiles/run-ctest-nophys' failed
> >
> > make[3]: *** [CMakeFiles/run-ctest-nophys] Error 8
> >
> > CMakeFiles/Makefile2:1392: recipe for target
> > 'CMakeFiles/run-ctest-nophys.dir/all' failed
> >
> > make[2]: *** [CMakeFiles/run-ctest-nophys.dir/all] Error 2
> >
> > CMakeFiles/Makefile2:1172: recipe for target 'CMakeFiles/check.dir/rule'
> > failed
> >
> > make[1]: *** [CMakeFiles/check.dir/rule] Error 2
> >
> > Makefile:626: recipe for target 'check' failed
> > make: *** [check] Error 2
> >
> > Not sure what could be wrong. Just to add, I get an error/warning during
> > installation which says "MPI is not compatible with thread-MPI. Disabling
> > thread-MPI". I am using -DGMX_MPI=on, and to get OpenMPI on Ubuntu 18.04
> > I used "sudo apt-get install openmpi-bin openmpi-common libopenmpi-dev".
> >
> > Please let me know how I can fix this.
> >
> > best regards,
> > D
> >
> >


Re: [gmx-users] Solvation of protein on membrane surface

2019-10-23 Thread Olga Press
Thanks a lot for your help.

On Mon, Oct 21, 2019 at 19:51 Justin Lemkul <jalem...@vt.edu> wrote:

>
>
> On 10/20/19 5:57 AM, Olga Press wrote:
> > Prof. Lemkul,
> > Thank you very much for your reply. I would be very grateful if you could
> > help me with some questions regarding the compression of voids.
> > Should I run the NPT equilibration on the whole system (including
> > protein + membrane + solvent + *ions*)? Meaning, should I continue my
> > protocol (adding ions --> NVT equilibration with position restraints on
> > the protein --> NPT equilibration with position restraints --> NPT
> > equilibration without restraints), or should I first run the NPT
> > equilibration without position restraints (and for how long?), and then
> > continue the protocol?
>
> A normal protocol with restraints on your protein should work fine.
> Those voids will be gone within tens of picoseconds.
>
> -Justin
>
> --
> ==
>
> Justin A. Lemkul, Ph.D.
> Assistant Professor
> Office: 301 Fralin Hall
> Lab: 303 Engel Hall
>
> Virginia Tech Department of Biochemistry
> 340 West Campus Dr.
> Blacksburg, VA 24061
>
> jalem...@vt.edu | (540) 231-3129
> http://www.thelemkullab.com
>
> ==
>


-- 
*Olga Press-Sandler*
Ph.D. student, Yifat Miller's lab
Department of Chemistry
Ben-Gurion University, Israel