[gmx-users] Hydrophobic interactions and electrostatic interactions between the protein chains

2019-03-11 Thread Mahboobeh Eslami
Hi dear GROMACS users,
I want to calculate the hydrophobic and electrostatic interactions between
the protein chains from a GROMACS trajectory. Please guide me. Thanks a lot
-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.

Re: [gmx-users] VMD movie making-remove water molecules from .trr after a simulation

2019-03-11 Thread mary ko
 Thank you, Justin. Is it possible to choose the protein and ligand separately 
(e.g. 1|13) in trjconv? I chose the protein-ligand subgroup, which was treated 
as a whole, so the ligand cannot be distinguished individually. Or should I 
make some modification in VMD? 
On Monday, March 11, 2019, 6:32:22 PM EDT, Justin Lemkul  
wrote:  
 
 

On 3/11/19 6:01 PM, mary ko wrote:
> Dear all
> How can I make a movie of a protein-ligand simulation in VMD? I opened
> complex.pdb and md-fit.trr; the movie plays, but due to the large number
> of water molecules it freezes after a few frames. How can I remove the
> solvent molecules from md-fit.trr? Any other suggestions for making the
> movie without freezing?

Use trjconv to save whatever subset of coordinates you want.
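For example, a sketch (the group numbers and the file name md.tpr are placeholders; list your actual groups with gmx make_ndx):

```shell
# Merge the Protein group with a hypothetical ligand group into one
# index group (e.g. type "1 | 13" at the make_ndx prompt, then "q"):
gmx make_ndx -f complex.pdb -o index.ndx
# Write a water-free trajectory containing only that merged group:
gmx trjconv -f md-fit.trr -s md.tpr -n index.ndx -o md-fit-nowater.xtc
```

Note that in VMD the protein and ligand can still be selected separately with atom selections such as "protein" and "resname LIG", so the groups need not be split in the trajectory itself.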

-Justin

-- 
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==


Re: [gmx-users] gmx trjconv trr xtc

2019-03-11 Thread Dallas Warren
.trr and .xtc are two different file formats: .trr stores full-precision
coordinates (plus, optionally, velocities and forces), while .xtc stores only
coordinates, compressed at reduced precision. Therefore, an .xtc file will be
significantly smaller than a .trr.
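A sketch of the conversion (file names from the question; requires a GROMACS installation):

```shell
# Convert TRR to XTC; -ndec sets the number of decimal places kept
# by the lossy XTC compression (the default is 3):
gmx trjconv -f out.last.5ns.trr -o out.last.5ns.xtc -ndec 3
# gmx check reports the number of frames and the precision of each
# file, a quick way to confirm nothing was dropped:
gmx check -f out.last.5ns.xtc
```

The size difference therefore reflects reduced precision (and the absence of velocities), not missing frames or atoms.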

Catch ya,

Dr. Dallas Warren
Drug Delivery, Disposition and Dynamics
Monash Institute of Pharmaceutical Sciences, Monash University
381 Royal Parade, Parkville VIC 3052
dallas.war...@monash.edu
-
When the only tool you own is a hammer, every problem begins to resemble a
nail.


On Tue, 12 Mar 2019 at 10:56, Alex  wrote:

> Dear GROMACS users,
> My system contains two molecule types, A and B, in water. Using
> "gmx trjconv -f out.xtc -o out.last.5ns.trr" I first extracted the last
> 5 ns of the system's XTC file as a TRR file and selected only the
> non-water contents, so the TRR file contains only A and B. The TRR file
> is 4.4 GB. Then I ran "gmx trjconv -pbc cluster -center yes -f
> out.last.5ns.trr -o out.last.5ns.xtc" and again chose only non-water.
> I expected out.last.5ns.trr and out.last.5ns.xtc to have the same size,
> 4.4 GB, since both contain only A and B; however, out.last.5ns.xtc is
> just 1.4 GB!
> Is that normal? Here I only care about the coordinates, not the
> velocities. Do you think I am losing any coordinates in the second step,
> from out.last.5ns.trr to out.last.5ns.xtc?
>
> Regards
> Alex


[gmx-users] gmx trjconv trr xtc

2019-03-11 Thread Alex
Dear GROMACS users,
My system contains two molecule types, A and B, in water. Using
"gmx trjconv -f out.xtc -o out.last.5ns.trr" I first extracted the last
5 ns of the system's XTC file as a TRR file and selected only the
non-water contents, so the TRR file contains only A and B. The TRR file
is 4.4 GB. Then I ran "gmx trjconv -pbc cluster -center yes -f
out.last.5ns.trr -o out.last.5ns.xtc" and again chose only non-water.
I expected out.last.5ns.trr and out.last.5ns.xtc to have the same size,
4.4 GB, since both contain only A and B; however, out.last.5ns.xtc is
just 1.4 GB!
Is that normal? Here I only care about the coordinates, not the
velocities. Do you think I am losing any coordinates in the second step,
from out.last.5ns.trr to out.last.5ns.xtc?

Regards
Alex


Re: [gmx-users] VMD movie making-remove water molecules from .trr after a simulation

2019-03-11 Thread Justin Lemkul



On 3/11/19 6:01 PM, mary ko wrote:

Dear all
How can I make a movie of a protein-ligand simulation in VMD? I opened 
complex.pdb and md-fit.trr; the movie plays, but due to the large number 
of water molecules it freezes after a few frames. How can I remove the 
solvent molecules from md-fit.trr? Any other suggestions for making the 
movie without freezing?


Use trjconv to save whatever subset of coordinates you want.

-Justin


Re: [gmx-users] gromacs error in vacuum preparation simulation

2019-03-11 Thread Justin Lemkul




On 3/11/19 5:45 PM, Mario Andres Rodriguez Pineda wrote:

If I don't use the -maxwarn option, I get the same error.


Omitting -maxwarn won't fix problems, but it is a very bad habit to 
casually use -maxwarn as it overrides critical problems with your input.



I also see this:
WARNING 1 [file topol.top, line 23986]:
   You are using Ewald electrostatics in a system with net charge. This can
   lead to severe artifacts, such as ions moving into regions with low
   dielectric, due to the uniform background charge. We suggest to
   neutralize your system with counter ions, possibly in combination with a
   physiological salt concentration.
Checking the file topol.top, that line is empty.

I'm trying to minimize the energy of my protein in vacuum, without ions or
solvent.


As the message says, you shouldn't be using PME for this system. In 
vacuum, you should be using plain cutoff electrostatics with infinite 
cutoffs (rlist=rcoulomb=rvdw=0).
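A minimal sketch of such an .mdp for in-vacuo minimization (values are illustrative; in GROMACS a cutoff of 0 means infinite, which requires the group cutoff scheme and pbc = no):

```
integrator      = steep
emtol           = 1000.0
nsteps          = 5000
cutoff-scheme   = group    ; Verlet requires PBC and finite cutoffs
nstlist         = 0
ns-type         = simple
pbc             = no
coulombtype     = Cut-off
vdw-type        = Cut-off
rlist           = 0
rcoulomb        = 0
rvdw            = 0
```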


-Justin



[gmx-users] VMD movie making-remove water molecules from .trr after a simulation

2019-03-11 Thread mary ko
Dear all
How can I make a movie of a protein-ligand simulation in VMD? I opened 
complex.pdb and md-fit.trr; the movie plays, but due to the large number 
of water molecules it freezes after a few frames. How can I remove the 
solvent molecules from md-fit.trr? Any other suggestions for making the 
movie without freezing?
Thank you.

Re: [gmx-users] gromacs error in vacuum preparation simulation

2019-03-11 Thread Mario Andres Rodriguez Pineda
If I don't use the -maxwarn option, I get the same error.
I also see this:
WARNING 1 [file topol.top, line 23986]:
  You are using Ewald electrostatics in a system with net charge. This can
  lead to severe artifacts, such as ions moving into regions with low
  dielectric, due to the uniform background charge. We suggest to
  neutralize your system with counter ions, possibly in combination with a
  physiological salt concentration.
Checking the file topol.top, that line is empty.

I'm trying to minimize the energy of my protein in vacuum, without ions or
solvent.

Em seg, 11 de mar de 2019 às 17:12, Justin Lemkul 
escreveu:

>
>
> On 3/11/19 3:20 PM, Mario Andres Rodriguez Pineda wrote:
> > Hi everybody
> > I want to run a dynamics simulation of a protein.
> > I'm trying to minimize the protein in vacuum before running the
> > simulation, but GROMACS gives me this error:
> > ---
> > Program: gmx grompp, version 2018.6
> > Source file: src/gromacs/gmxpreprocess/grompp.cpp (line 1991)
> >
> > Fatal error:
> > There was 1 error in input file(s)
> >
> > For more information and tips for troubleshooting, please check the
> GROMACS
> > website at http://www.gromacs.org/Documentation/Errors
> > ---
> >
> --
> > MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> > with errorcode 1.
> >
> > NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> > You may or may not see output from other processes, depending on
> > exactly when Open MPI kills them.
> >
> --
> > This is the command I use for minimization in vacuum:
> > gmx_mpi grompp -f vacuum.mdp -c cbd212_box.gro -p topol.top -o
> > cbd212_vac.tpr -maxwarn 1
>
> Run grompp interactively if you want to see the actual error. It may
> also be printed to a log file from your queuing system. Also, never use
> -maxwarn.
>
> -Justin
>


-- 
*MSc. MARIO ANDRÉS RODRÍGUEZ PINEDA*
*Estudiante Doctorado en Biotecnología*

*UNAL- MEDELLÍN/ IQ- USP*

*Grupo de Pesquisa em Ressonância Magnética Nuclear de Biomoléculas *
*Av. Prof. Lineu Prestes 748, Sao Paulo SP, 05508-000, Tel: + 55 11 3091
1475*

Re: [gmx-users] gromacs error in vacuum preparation simulation

2019-03-11 Thread Justin Lemkul




On 3/11/19 3:20 PM, Mario Andres Rodriguez Pineda wrote:

Hi everybody
I want to run a dynamics simulation of a protein.
I'm trying to minimize the protein in vacuum before running the
simulation, but GROMACS gives me this error:
---
Program: gmx grompp, version 2018.6
Source file: src/gromacs/gmxpreprocess/grompp.cpp (line 1991)

Fatal error:
There was 1 error in input file(s)

For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
---
--
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--
This is the command I use for minimization in vacuum:
gmx_mpi grompp -f vacuum.mdp -c cbd212_box.gro -p topol.top -o
cbd212_vac.tpr -maxwarn 1


Run grompp interactively if you want to see the actual error. It may 
also be printed to a log file from your queuing system. Also, never use 
-maxwarn.
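For example, a sketch of running it interactively (file names taken from the command above) so the full diagnostic is visible:

```shell
# Run grompp without -maxwarn and keep a copy of everything it prints;
# the real problem is described in the WARNING/ERROR text above the
# final "Fatal error" line:
gmx grompp -f vacuum.mdp -c cbd212_box.gro -p topol.top \
    -o cbd212_vac.tpr 2>&1 | tee grompp.log
grep -i -B 5 "error" grompp.log
```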


-Justin



[gmx-users] gromacs error in vacuum preparation simulation

2019-03-11 Thread Mario Andres Rodriguez Pineda
Hi everybody
I want to run a dynamics simulation of a protein.
I'm trying to minimize the protein in vacuum before running the
simulation, but GROMACS gives me this error:
---
Program: gmx grompp, version 2018.6
Source file: src/gromacs/gmxpreprocess/grompp.cpp (line 1991)

Fatal error:
There was 1 error in input file(s)

For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
---
--
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--
This is the command I use for minimization in vacuum:
gmx_mpi grompp -f vacuum.mdp -c cbd212_box.gro -p topol.top -o
cbd212_vac.tpr -maxwarn 1


I can't find the solution for this problem.

Thanks for your help

Re: [gmx-users] grompp is using a very large amount of memory on a modestly-sized system

2019-03-11 Thread Sean Marks
Hi, Mark,

I'd be happy to, as soon as I get a chance.

I know very little about how GROMACS works internally, but I had a few
ideas I wanted to share in the hopes that they might help. One is that
pairwise parameters for electrostatics could be implemented in the same way
that LJ parameters are stored. That would provide a tremendous amount of
flexibility for use cases far beyond my own. Or there could simply be an
FEP flag for only scaling between molecule types, rather than also between
molecules of a given type. Again, you guys are the experts and I know you
have other priorities. Just my thoughts.

Best,
Sean

On Fri, Mar 8, 2019 at 3:44 PM Mark Abraham 
wrote:

> Hi,
>
> I don't have a solution for the question at hand, but it'd be great to have
> your inputs attached to a new issue at https://redmine.gromacs.org, so
> that
> we can have such an input case to test with, so that we can improve the
> simplistic implementation! Please upload it if you can.
>
> Mark
>
> On Fri., 8 Mar. 2019, 19:24 Sean Marks,  wrote:
>
> > Scratch that comment about sparseness. I am short on sleep, and for a
> > moment thought I was talking about constraints, not electrostatics.
> >
> > On Fri, Mar 8, 2019 at 1:12 PM Sean Marks 
> wrote:
> >
> > > I understand now, thank you for the prompt response. While the matrix
> > > would actually be quite sparse (since the constraints are localized to
> > each
> > > ice molecule), I take it that memory is being allocated for a dense
> > matrix.
> > >
> > > That aside, is it feasible to accomplish my stated goal of scaling
> > > ice-water electrostatics while leaving other interactions unaffected?
> One
> > > alternative I considered was manually scaling down the charges
> > themselves,
> > > but doing this causes the lattice to lose its form.
> > >
> > > On Fri, Mar 8, 2019 at 12:28 PM Justin Lemkul  wrote:
> > >
> > >>
> > >>
> > >> On 3/8/19 11:04 AM, Sean Marks wrote:
> > >> > Hi, everyone,
> > >> >
> > >> > I am running into an issue where grompp is using a tremendous amount
> > of
> > >> > memory and crashing, even though my system is not especially large
> > >> (63976
> > >> > atoms).
> > >> >
> > >> > I am using GROMACS 2016.3.
> > >> >
> > >> > My system consists of liquid water (7,930 molecules) next to a block
> > of
> > >> ice
> > >> > (8,094 molecules). The ice oxygens are restrained to their lattice
> > >> position
> > >> > with a harmonic potential with strength k = 4,000 kJ/mol/nm^2. I am
> > >> using
> > >> > the TIP4P/Ice model, which is a rigid 4-site model with a negative
> > >> partial
> > >> > charge located on a virtual site rather than the oxygen.
> > >> >
> > >> > My goal is to systematically reduce the electrostatic interactions
> > >> between
> > >> > the water molecules and the position-restrained ice, while leaving
> > >> > water-water and ice-ice interactions unaffected.
> > >> >
> > >> > To accomplish this, I am modeling all of the ice molecules using a
> > >> single
> > >> > moleculetype so that I can take advantages of GROMACS' FEP features
> to
> > >> > selectively scale interactions. I explicitly specify all constraints
> > and
> > >> > exclusions in the topology file. This moleculetype contains one
> > virtual
> > >> > site, 3 constraints, and 4 exclusions per "residue" (ice molecule).
> > >> >
> > >> > When I run grompp, I receive the following error, which I think
> means
> > >> that
> > >> > a huge block of memory (~9 GB) was requested but could not be
> > allocated:
> > >> >
> > >> > =
> > >> > Command line:
> > >> >gmx grompp -f npt.mdp -c md.gro -p topol.top -n index.ndx -r
> > >> > initconf_packmol.gro -o input.tpr -maxwarn 2 -pp processed.top
> > >> >
> > >> > ...
> > >> >
> > >> > Generated 21 of the 21 non-bonded parameter combinations
> > >> > Generating 1-4 interactions: fudge = 0.5
> > >> > Generated 21 of the 21 1-4 parameter combinations
> > >> > Excluding 3 bonded neighbours molecule type 'ICE'
> > >> > turning H bonds into constraints...
> > >> > Excluding 3 bonded neighbours molecule type 'SOL'
> > >> > turning H bonds into constraints...
> > >> > Coupling 1 copies of molecule type 'ICE'
> > >> > Setting gen_seed to 1021640799
> > >> > Velocities were taken from a Maxwell distribution at 273 K
> > >> > Cleaning up constraints and constant bonded interactions with
> virtual
> > >> sites
> > >> > Removing all charge groups because cutoff-scheme=Verlet
> > >> >
> > >> > ---
> > >> > Program: gmx grompp, version 2016.3
> > >> > Source file: src/gromacs/utility/smalloc.cpp (line 226)
> > >> >
> > >> > Fatal error:
> > >> > Not enough memory. Failed to realloc -8589934588 bytes for
> il->iatoms,
> > >> > il->iatoms=25e55010
> > >> > (called from file
> > >> >
> > >>
> >
> /home/semarks/source/gromacs/2016.3/icc/serial/gromacs-2016.3/src/gromacs/
> > >> > gmxpreprocess/convparm.cpp,
> > >> > line 565)
> > >> >
> > >> > For more information and tips for troubleshooting, 

Re: [gmx-users] gmx analyze -av

2019-03-11 Thread Justin Lemkul




On 3/11/19 2:08 PM, Amin Rouy wrote:

Hi,

I notice that the -av flag in gmx analyze does not give the average value
of the set; rather, it reproduces the data file itself.


As it should. From gmx help analyze:

"Option -av produces the average over the sets."

If you want the average of different sets, just use gmx analyze -f.
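For instance (a sketch; data.xvg is a placeholder for a file with one or more data sets):

```shell
# Prints the average, standard deviation, etc. of each set to stdout;
# -av additionally writes, at each x value, the average over all sets:
gmx analyze -f data.xvg -av average.xvg
```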

-Justin



[gmx-users] gmx analyze -av

2019-03-11 Thread Amin Rouy
Hi,

I notice that the -av flag in gmx analyze does not give the average value
of the set; rather, it reproduces the data file itself.


Re: [gmx-users] Simulation crashed - Large VCM, Pressure scaling more than 1%, Bond length not finite

2019-03-11 Thread zeineb SI CHAIB


Thanks a lot, Kevin, for your comments and help.

Zeineb


[gmx-users] Build a .top file

2019-03-11 Thread monia kam
Hello,

I'm starting to learn about MD simulations and have done the Lysozyme in
Water tutorial for GROMACS. My task is to run MD simulations of sugars
using the do_glycans tool and the Glycam06 force field; however, I don't
know how to generate the .top file.
I would appreciate your help with that.

-- 
*Monia KAMLY,*

*Ingénieur en Chimie Analytique et Instrumentation,*
*Etudiante Master 2 Chimie, Spécialité, Agro-ressources Biomolécules et
Innovations*
*Faculté des Sciences et Techniques, Université de Limoges.*

*123 Avenue Albert Thomas, 87000 Limoges*
*Mobile: 06 99 78 86 29*
*Email: monia.ka...@etu.unilim.fr *

[gmx-users] pinoffset w LINCS error

2019-03-11 Thread Tamas Hegedus

Hi,

I have two servers with an AMD 2950X (16 physical cores): one with 16 GB RAM 
and an RTX 2080 Ti, the other with 32 GB RAM and a GTX 1080 Ti.

I use GROMACS 2018.3, compiled the same way on both servers.

I wanted to run two simulations on the same host (8+8 cores, on GPU 0 and 
GPU 1). They run fine on the 32 GB RAM / GTX 1080 Ti machine.

* On the 16 GB machine, a job run with -pinoffset fails with LINCS errors. 
It runs without problems without -pinoffset.

* If I run only a single job, but with -pinoffset, it also fails; the 
error in that case is "cannot settle water molecule".

These observations suggest to me that memory, file naming, and the 
settings (-pinoffset, -gpu_id) are OK and some magical problem exists. Do 
you have any suggestions?


Thanks, Tamas

--
Tamas Hegedus, PhD
Senior Research Fellow
Department of Biophysics and Radiation Biology
Semmelweis University | phone: (36) 1-459 1500/60233
Tuzolto utca 37-47| mailto:ta...@hegelab.org
Budapest, 1094, Hungary   | http://www.hegelab.org


[gmx-users] Gomacs 2019 build on sles12 and centos

2019-03-11 Thread Nelson Chris AWE
Hi All,
I've built GROMACS 2019 on both a CentOS 7 and a SLES 12 machine,
using gcc@7.2.0.
Dependencies:
fftw@3.3.8
openmpi@3.1.3

When running "make check" on both machines, I get the same timeout error 
for test 29 - see below for an extract, and attached for the full test 
output. Does anyone have any ideas?

Thanks,
Chris.

===
29/40 Test #29: GmxPreprocessTests ...***Timeout  30.15 sec
[==] Running 26 tests from 4 test cases.
[--] Global test environment set-up.
[--] 4 tests from GenconfTest
[ RUN  ] GenconfTest.nbox_Works
[   OK ] GenconfTest.nbox_Works (186 ms)
[ RUN  ] GenconfTest.nbox_norenumber_Works
[   OK ] GenconfTest.nbox_norenumber_Works (92 ms)
[ RUN  ] GenconfTest.nbox_dist_Works
[   OK ] GenconfTest.nbox_dist_Works (372 ms)
[ RUN  ] GenconfTest.nbox_rot_Works
center of geometry: 1.733667, 1.477000, 0.905167
center of geometry: 1.733667, 1.477000, 0.905167
center of geometry: 1.733667, 1.477000, 0.905167
center of geometry: 1.733667, 1.477000, 0.905167
center of geometry: 1.733667, 1.477000, 0.905167
center of geometry: 1.733667, 1.477000, 0.905167
center of geometry: 1.733667, 1.477000, 0.905167
center of geometry: 1.733667, 1.477000, 0.905167
center of geometry: 1.733667, 1.477000, 0.905167
center of geometry: 1.733667, 1.477000, 0.905167
center of geometry: 1.733667, 1.477000, 0.905167
center of geometry: 1.733667, 1.477000, 0.905167
[   OK ] GenconfTest.nbox_rot_Works (471 ms)
[--] 4 tests from GenconfTest (1121 ms total)

[--] 5 tests from InsertMoleculesTest
[ RUN  ] InsertMoleculesTest.InsertsMoleculesIntoExistingConfiguration
Reading solute configuration
Reading molecule configuration
Initialising inter-atomic distances...

WARNING: Masses and atomic (Van der Waals) radii will be guessed
 based on residue and atom names, since they could not be
 definitively assigned from the information in your input
 files. These guessed numbers might deviate from the mass
 and radius of the atom type. Please check the output
 files if necessary.

sprucea5:spack-build>make check
[100%] Running all tests except physical validation
/scratch/prod-spruce/acs/spack/opt/linux-sles12-x86_64/gcc-7.2.0/cmake-3.13.4-ocbi3jb2k4zbyt2lx7yvrm2ysb65nix6/bin/ctest
 --output-on-failure -E physicalvalidationtests
Test project 
/ROWAN/group/pts/spack/dev-2019-02-19/var/spack/stage/gromacs-2019-55qcdjpc4twm5abmvt4kghka3eanxdc6/gromacs-2019/spack-build
  Start  1: TestUtilsUnitTests
 1/40 Test  #1: TestUtilsUnitTests ...   Passed1.99 sec
  Start  2: TestUtilsMpiUnitTests
 2/40 Test  #2: TestUtilsMpiUnitTests    Passed0.41 sec
  Start  3: MdlibUnitTest
 3/40 Test  #3: MdlibUnitTest    Passed0.39 sec
  Start  4: AppliedForcesUnitTest
 4/40 Test  #4: AppliedForcesUnitTest    Passed2.24 sec
  Start  5: ListedForcesTest
 5/40 Test  #5: ListedForcesTest .   Passed1.29 sec
  Start  6: CommandLineUnitTests
 6/40 Test  #6: CommandLineUnitTests .   Passed0.42 sec
  Start  7: DomDecTests
 7/40 Test  #7: DomDecTests ..   Passed0.24 sec
  Start  8: EwaldUnitTests
 8/40 Test  #8: EwaldUnitTests ...   Passed9.78 sec
  Start  9: FFTUnitTests
 9/40 Test  #9: FFTUnitTests .   Passed0.48 sec
  Start 10: GpuUtilsUnitTests
10/40 Test #10: GpuUtilsUnitTests    Passed0.23 sec
  Start 11: HardwareUnitTests
11/40 Test #11: HardwareUnitTests    Passed0.24 sec
  Start 12: MathUnitTests
12/40 Test #12: MathUnitTests    Passed4.02 sec
  Start 13: MdrunUtilityUnitTests
13/40 Test #13: MdrunUtilityUnitTests    Passed0.25 sec
  Start 14: MdrunUtilityMpiUnitTests
14/40 Test #14: MdrunUtilityMpiUnitTests .   Passed0.27 sec
  Start 15: OnlineHelpUnitTests
15/40 Test #15: OnlineHelpUnitTests ..   Passed0.53 sec
  Start 16: OptionsUnitTests
16/40 Test #16: OptionsUnitTests .   Passed2.04 sec
  Start 17: RandomUnitTests
17/40 Test #17: RandomUnitTests ..   Passed0.50 sec
  Start