[gmx-users] Atomtype opls_236 not found

2019-01-31 Thread Anjali Patel
Hello users, I know there are many discussions on this topic and I have gone through all of them, and the whole of chapter 5 as well, but I did not find a solution or see where I am going wrong. I want to generate a .tpr file. I am using gmx grompp -f ions.mdp -c solv.gro -p topol.top -o ions.tpr. In that I have
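This error usually means the OPLS force-field atom types are not visible to grompp when the topology is read. A minimal sketch of the check, assuming the OPLS-AA force field and the file names quoted above (topol.top, ions.mdp, solv.gro), might be:

  # Check that the force-field include comes before any [ moleculetype ] section
  # that uses opls_ atom types; the exact include line depends on the force field chosen.
  head -n 5 topol.top    # expect something like: #include "oplsaa.ff/forcefield.itp"
  gmx grompp -f ions.mdp -c solv.gro -p topol.top -o ions.tpr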

Re: [gmx-users] Make check not passing tests on 2018.5

2019-01-31 Thread Schulz, Roland
Hi, What compiler are you using? Please also paste the output of "gmx -version". Roland

[gmx-users] Make check not passing tests on 2018.5

2019-01-31 Thread David Lister
Hello, I've compiled GROMACS 2018.5 in double precision a couple of times now and it keeps failing the same tests every time. This is on Ubuntu 18.04 with an i9-7900X. The cmake command I used was: cmake .. -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=on -DGMX_BUILD_OWN_FFTW=ON -DCMAKE_BUILD_TYPE=Release
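For diagnosing which tests fail and why, a hedged sketch (standard CMake/CTest commands run from the build directory, not taken from this thread) would be:

  make check                                  # runs the unit and regression tests
  ctest --rerun-failed --output-on-failure    # re-run only the failed tests with full output

The names of the failing tests and their output are usually what the list will ask for.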

Re: [gmx-users] info about gpus

2019-01-31 Thread Moir, Michael (MMoir)
Sorry, that's 32 GB of RAM. I'm old, a MB used to be a lot of memory! Mike

Re: [gmx-users] info about gpus

2019-01-31 Thread Moir, Michael (MMoir)
Stefano, With a motherboard that doesn't split the PCI-E bandwidth, I get a 20% improvement in computation speed with 2 GPUs. Whether or not you think this is worth the extra $, I leave that up to you! My system is: i9-9900K, ASUS WS Z390 Pro motherboard, 2x 1070 Ti GPUs, 32 MB 3200 MHz

[gmx-users] Calculating bonded and non-bonded energy for a set of contiguous atoms

2019-01-31 Thread Ashraya Ravikumar
Hi, I have the simulation trajectory of a protein. I want to examine the bonded as well as the non-bonded energy for a contiguous set of backbone atoms in the protein, say from the C-alpha atom of residue i to the C-alpha atom of residue i+2. I saw that there is an option in the mdp file called
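If the mdp option in question is energygrps, a rough sketch of the usual rerun-based workflow (the index-group and file names here are illustrative, not from the thread; note that energygrps decomposes only the non-bonded terms, while bonded terms are reported for the whole system) is:

  echo "energygrps = BB_i_to_i2" >> rerun.mdp            # group made beforehand with gmx make_ndx
  gmx grompp -f rerun.mdp -c conf.gro -p topol.top -n index.ndx -o rerun.tpr
  gmx mdrun -s rerun.tpr -rerun traj.xtc -e rerun.edr
  gmx energy -f rerun.edr -o energy.xvg                  # select the Coul-SR/LJ-SR group pair terms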

Re: [gmx-users] info about gpus

2019-01-31 Thread Moir, Michael (MMoir)
Stefano, I'm investigating that myself. If your motherboard splits the bandwidth between your PCI-E slots then there is no advantage to having 2 GPUs for a GPU like the 1080. I have just upgraded to a better motherboard that does not split the bandwidth, but I have not finished my testing.

[gmx-users] calculate potential energy or short-range and long-range energy and enthalpy per residue

2019-01-31 Thread milad bagheri
I performed MD for an apoprotein. After this I want to calculate the potential energy, or the short-range and long-range energy, and the enthalpy "per residue". Please help me: how can I do this?
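One hedged way to get per-residue groups for such an analysis (names are illustrative; this yields non-bonded group energies rather than a full per-residue enthalpy) is to split the protein by residue and use those groups as energy groups in a rerun:

  gmx make_ndx -f md.tpr -o perres.ndx    # at the prompt, "splitres 1" splits group 1 (e.g. Protein) into one group per residue
  # list the resulting per-residue groups under energygrps in the .mdp, then
  # gmx grompp ... -n perres.ndx and gmx mdrun -rerun over the trajectory as usual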

[gmx-users] info about gpus

2019-01-31 Thread Stefano Guglielmo
Dear all, I am trying to set up a new workstation and I would like to know if there is a significant improvement in performance with two GPUs (GTX 1080 Ti or RTX 2080) rather than just one, and if so, with which CPU/RAM requirements. Thanks in advance for any advice and suggestions. Stefano

Re: [gmx-users] WG: Issue with CUDA and gromacs

2019-01-31 Thread Szilárd Páll
On Thu, Jan 31, 2019 at 2:14 PM Szilárd Páll wrote: > > On Wed, Jan 30, 2019 at 5:15 PM Tafelmeier, Stefanie > wrote: > > > > Dear all, > > > > We are facing an issue with the CUDA toolkit. > > We tried several combinations of GROMACS versions and CUDA toolkits. No > > toolkit older than 9.2 could be tried, as there are no NVIDIA drivers > > available for a

[gmx-users] methods of installing NVIDIA display drivers [forked from Re: Gromacs 2018.5 with CUDA]

2019-01-31 Thread Szilárd Páll
On Thu, Jan 31, 2019 at 3:18 PM wrote:

Re: [gmx-users] Gromacs 2018.5 with CUDA

2019-01-31 Thread pbuscemi
On Wed, Jan 30, 2019 at 5:14 PM wrote: > Vlad,

Re: [gmx-users] (no subject)

2019-01-31 Thread pbuscemi
Run the tutorials. > Sir, I am a beginner to GROMACS. I was unable to understand

Re: [gmx-users] About fprintf and debugging

2019-01-31 Thread Szilárd Páll
gmx mdrun -debug N, where N is the debug level and can be 1 or 2. -- Szilárd On Mon, Jan 28, 2019 at 3:49 PM Mahmood Naderan wrote: > > Hi > Where should I set the flag in order to see the fprintf statements, like > if (debug) > { > fprintf(debug, "PME: number of ranks =
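A minimal usage sketch (the output file name and -deffnm md are illustrative):

  gmx mdrun -debug 1 -deffnm md    # writes the fprintf(debug, ...) messages to a separate .debug file alongside the run output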

Re: [gmx-users] WG: Issue with CUDA and gromacs

2019-01-31 Thread Szilárd Páll
On Wed, Jan 30, 2019 at 5:15 PM Tafelmeier, Stefanie wrote: > > Dear all, > > We are facing an issue with the CUDA toolkit. > We tried several combinations of GROMACS versions and CUDA toolkits. No > toolkit older than 9.2 could be tried, as there are no NVIDIA drivers available for a

Re: [gmx-users] Gromacs 2018.5 with CUDA

2019-01-31 Thread Szilárd Páll
On Wed, Jan 30, 2019 at 5:14 PM wrote: > > Vlad, > > 390 is an 'old' driver now. Try something simple like installing the 410.x driver > and see if that resolves the issue. If you need to update the compiler, g++-7 > may not work, but g++-6 does. It is worth checking compatibility first. The GROMACS
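A hedged sketch of such a compatibility check (standard NVIDIA and GCC commands, not taken from this thread):

  nvidia-smi | head -n 3    # installed display driver version (newer drivers also print the highest CUDA version they support)
  nvcc --version            # CUDA toolkit version that GROMACS will be built against
  gcc --version             # host compiler; it must be one the toolkit supports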

Re: [gmx-users] gmx covar and gmx anaeig

2019-01-31 Thread David van der Spoel
On 2019-01-31 at 12:28, Özge ENGİN wrote: Hi All, I am working on 3 systems in parallel: 1) protein only, 2) protein+ligand1 and 3) protein+ligand2, using the same protein. Here, I want to get the RMSF profiles of the systems along the first and second eigenvectors, which can be obtained with the gmx anaeig

Re: [gmx-users] Gromacs 2018.5 with CUDA

2019-01-31 Thread Szilárd Páll
On Wed, Jan 30, 2019 at 4:56 PM Владимир Богданов wrote: > > Hi, > > Yes, I think so, because it seems to be working with namd-cuda right now: Of course, because in the meantime you upgraded your driver. NAMD, or in fact any program that uses CUDA 9.2, will _not_ run with drivers incompatible with

Re: [gmx-users] Gromacs 2018.5 with CUDA

2019-01-31 Thread Szilárd Páll
On Wed, Jan 30, 2019 at 7:37 AM Владимир Богданов < bogdanov-vladi...@yandex.ru> wrote: > Hey everyone! > > I need help, please. When I try to run MD with a GPU I get the following error: > > Command line: > > gmx_mpi mdrun -deffnm md -nb auto > > > > Back Off! I just backed up md.log to ./#md >

[gmx-users] gmx covar and gmx anaeig

2019-01-31 Thread Özge ENGİN
Hi All, I am working on 3 systems in parallel: 1) protein only, 2) protein+ligand1 and 3) protein+ligand2, using the same protein. Here, I want to get the RMSF profiles of the systems along the first and second eigenvectors, which can be obtained with the gmx anaeig -rmsf option. I want to use the eigenvectors
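The usual two-step workflow looks roughly like the sketch below (file names are illustrative; whether eigenvectors from one system can be reused for another depends on the fitting/analysis groups matching between systems):

  gmx covar  -s ref.tpr -f traj.xtc -o eigenval.xvg -v eigenvec.trr
  gmx anaeig -s ref.tpr -f traj.xtc -v eigenvec.trr -first 1 -last 2 -rmsf rmsf.xvg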

Re: [gmx-users] multiple GPU usage for simulation

2019-01-31 Thread praveen kumar
Dear Paul, Many thanks for your help. As per your suggestion, I am now able to perform a simulation using two GPUs. Earlier there were some unnecessary flags in the installation. Now I have modified the installation like this: cmake .. -DGMX_THREAD_MPI=ON -DGMX_GPU=ON -DGMX_X11=ON For running the simulation
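For reference, a hedged example of a two-GPU thread-MPI run (file names and thread counts are illustrative; GPU IDs 0 and 1 assumed):

  gmx mdrun -deffnm md -ntmpi 2 -ntomp 8 -nb gpu -gpu_id 01    # two PP ranks, one GPU each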

Re: [gmx-users] Gromacs Tutorials

2019-01-31 Thread Benson Muite
Satya, It is helpful to have a subject in your messages. A possible start: http://www.mdtutorials.com/gmx/index.html Benson On 1/31/19 7:18 AM, Satya Ranjan Sahoo wrote: > Sir, > I am a beginner to GROMACS. I was unable to understand how to create all > the ions.mdp, md.mdp, mout.mdp,