Re: [gmx-users] Gromacs 2018 and GPU PME

2018-02-09 Thread Szilárd Páll
On Fri, Feb 9, 2018 at 4:25 PM, Szilárd Páll wrote:
> Hi,
>
> First of all, have you read the docs (admittedly somewhat brief):
> http://manual.gromacs.org/documentation/2018/user-guide/mdrun-performance.html#types-of-gpu-tasks
>
> The current PME GPU was optimized for

Re: [gmx-users] Tests with Threadripper and dual gpu setup

2018-02-09 Thread Szilárd Páll
Hi, Thanks for the report! Did you build with or without hwloc? There is a known issue with the automatic pin stride when not using hwloc, which will lead to "compact" pinning (using half of the cores with 2 threads/core) when <= half of the threads are launched (instead of using all cores with 1
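Until that is fixed, the stride can be set explicitly; a minimal sketch, assuming a 16-core/32-thread Threadripper and that the thread counts below are illustrative:

    # pin one thread per physical core instead of packing two per core
    gmx mdrun -ntomp 16 -pin on -pinoffset 0 -pinstride 2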

Re: [gmx-users] Gromacs 2018 installation failed

2018-02-09 Thread Elton Carvalho
Hello, Qinghua, The error message refers to the standard library. I believe the package that provides this on Ubuntu is glibc; check that it's a recent enough version. Another thing is that the linker (ld) needs to support C++11. That's the binutils package. I've had success with version 2.29.
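A quick way to check both versions (these are standard GNU tools, so the commands should work on any similar system):

    ldd --version | head -n1   # ldd ships with glibc, so this reports the glibc version
    ld --version  | head -n1   # reports the binutils/linker version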

[gmx-users] Tool for molecule drawing

2018-02-09 Thread Momin Ahmad
Hello, does anybody know of software (with a GUI!) that allows you to draw a molecule and assign residue names atom by atom? For example, display a molecule in the GUI and click on the atoms to define the residue name. Thanks in advance, Momin -- Momin Ahmad Karlsruhe Institute of Technology (KIT)

Re: [gmx-users] GMX 2018 regression tests: cufftPlanMany R2C plan failure (error code 5)

2018-02-09 Thread Szilárd Páll
Great to hear! (Also note that one thing we have explicitly focused on is not only peak performance, but getting as close to peak as possible with just a few CPU cores! You should be able to get >75% of peak performance with just 3-5 Xeon or 2-3 desktop cores rather than needing a full fast CPU.) -- Szilárd On

Re: [gmx-users] Gromacs 2018 and GPU PME

2018-02-09 Thread Daniel Kozuch
Szilárd, If I may jump in on this conversation: I am having the reverse problem (which I assume others may encounter also), where I am attempting a large REMD run (84 replicas) and I have access to, say, 12 GPUs and 84 CPUs. Basically, I have fewer GPUs than simulations. Is there a logical approach
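For reference, a sketch of the kind of multi-simulation launch being described, assuming one directory per replica (the directory names are illustrative). GROMACS shares the GPUs it detects on each node among the ranks running there, so several replicas end up on each GPU:

    # 84 replicas, exchange attempts every 1000 steps
    mpirun -np 84 gmx_mpi mdrun -multidir rep_{000..083} -replex 1000 -nb gpu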

[gmx-users] Domain decomposition for parallel simulations

2018-02-09 Thread Kevin C Chan
Dear Users, I have encountered the problem "There is no domain decomposition for n nodes that is compatible with the given box and a minimum cell size of x nm", and from reading through the GROMACS website and some threads I understand that the problem might be caused by breaking the system into
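For illustration, two common ways to work around this error (the rank and thread counts below are examples only): use fewer ranks with more OpenMP threads each, which gives larger domain-decomposition cells, or request a specific DD grid directly:

    # fewer, larger domains: 4 thread-MPI ranks x 8 OpenMP threads instead of 32 x 1
    gmx mdrun -ntmpi 4 -ntomp 8 -deffnm md
    # or specify the decomposition grid explicitly
    gmx mdrun -dd 4 2 2 -deffnm md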

Re: [gmx-users] GPU load from nvidia-smi

2018-02-09 Thread Szilárd Páll
On Thu, Feb 8, 2018 at 10:20 PM, Mark Abraham wrote:
> Hi,
>
> On Thu, Feb 8, 2018 at 8:50 PM Alex wrote:
>
> > Got it, thanks. Even with the old style input I now have a 42% speed up
> > with PME on GPU. How, how can I express my enormous

[gmx-users] restoring pullf.xvg file

2018-02-09 Thread Nick Johans
Hi, I deleted the pullf.xvg file unintentionally. Is there any way to restore and reproduce it from other outputs? Best regards -Nick

Re: [gmx-users] Gromacs 2018 and GPU PME

2018-02-09 Thread Alex
Just to quickly jump in, because Mark suggested taking a look at the latest doc, and unfortunately I must admit that I didn't understand what I read. I appear to be especially struggling with the idea of gputasks. Can you please explain what is happening in this line? -pme gpu -nb gpu -ntmpi 8

Re: [gmx-users] Gromacs 2018 and GPU PME

2018-02-09 Thread Szilárd Páll
Hi, First of all, have you read the docs (admittedly somewhat brief): http://manual.gromacs.org/documentation/2018/user-guide/mdrun-performance.html#types-of-gpu-tasks The current PME GPU was optimized for single-GPU runs. Using multiple GPUs with PME offloaded works, but this mode hasn't been an
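For the single-GPU case the docs describe, the whole offload fits in one line; a minimal sketch:

    # one rank, with short-ranged nonbonded and PME both offloaded to the same GPU
    gmx mdrun -ntmpi 1 -nb gpu -pme gpu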

Re: [gmx-users] Gromacs 2018 installation failed

2018-02-09 Thread Qinghua Liao
Hello Elton, Thanks a lot for your help! I just tried to load a binutils module (it was installed on the cluster) and install Gromacs 2018 again, and it works now! All the best, Qinghua On 02/09/2018 11:33 PM, Elton Carvalho wrote: If you are in a hurry, you can download the binutils package

Re: [gmx-users] Gromacs 2018 installation failed

2018-02-09 Thread Elton Carvalho
If you are in a hurry, you can download the binutils package from https://www.gnu.org/software/binutils/ and compile it on your own, setting the prefix to a directory in your home, then adjusting $PATH to give your binary the highest priority. Cheers, Elton On Fri, Feb 9, 2018 at 8:17 PM, Qinghua
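A sketch of that recipe; the version matches the 2.29 reported to work earlier in the thread, and the install path is just an example:

    wget https://ftp.gnu.org/gnu/binutils/binutils-2.29.tar.gz
    tar xf binutils-2.29.tar.gz && cd binutils-2.29
    ./configure --prefix=$HOME/opt/binutils
    make -j4 && make install
    export PATH=$HOME/opt/binutils/bin:$PATH   # puts the new ld first in the search order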

Re: [gmx-users] Do I need to use a position restraint during the equilibration stage (NVT) if I am preparing an amorphous sample?

2018-02-09 Thread Krzysztof Makuch
Hi, That's always your call. You perform equilibration to transition between EM (energy minimization) and MD (kinetic energy). In other words, you slowly start MD and prevent unwanted, unrealistic rearrangements in your system. If the initial positions are important, you restrain the interesting
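In case it helps, the standard GROMACS mechanism for this is the POSRES define: the restraints sit in the topology behind an #ifdef and are switched on from the .mdp. A sketch, assuming a posre.itp like the one pdb2gmx writes:

    ; in the equilibration .mdp:
    define = -DPOSRES

    ; in the topology, after the [ moleculetype ] it applies to:
    #ifdef POSRES
    #include "posre.itp"
    #endif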

Re: [gmx-users] Gromacs 2018 installation failed

2018-02-09 Thread Qinghua Liao
Hello Elton, Thanks a lot for the information. I already sent an e-mail to the administrator; hopefully they will fix it. All the best, Qinghua On 02/09/2018 08:03 PM, Elton Carvalho wrote: Hello, Qinghua, The error message refers to the standard library. I believe the package that

Re: [gmx-users] Gromacs 2018 and GPU PME

2018-02-09 Thread Mark Abraham
On Fri, Feb 9, 2018, 16:57 Daniel Kozuch wrote:
> Szilárd,
>
> If I may jump in on this conversation, I am having the reverse problem
> (which I assume others may encounter also) where I am attempting a large
> REMD run (84 replicas) and I have access to say 12 GPUs and 84

Re: [gmx-users] restoring pullf.xvg file

2018-02-09 Thread Mark Abraham
Unfortunately not.

Mark

On Fri, Feb 9, 2018, 17:56 Nick Johans wrote:
> Hi,
>
> I deleted the pullf.xvg file unintentionally. Is there any way to restore
> and reproduce it from other outputs?
>
> Best regards
> -Nick

Re: [gmx-users] Gromacs 2018 and GPU PME

2018-02-09 Thread Mark Abraham
Hi,

On Fri, Feb 9, 2018, 18:05 Alex wrote:
> Just to quickly jump in, because Mark suggested taking a look at the
> latest doc and unfortunately I must admit that I didn't understand what
> I read. I appear to be especially struggling with the idea of gputasks.
> Szilard's
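As a reading aid for that command line, a commented sketch; the -gputasks string below is an example mapping, not the one from the original mail:

    gmx mdrun -ntmpi 8 -nb gpu -pme gpu -npme 1 -gputasks 00000011
    # -ntmpi 8     start 8 thread-MPI ranks
    # -nb gpu      offload short-ranged nonbonded work to GPUs
    # -pme gpu     offload PME, here to one dedicated PME rank (-npme 1)
    # -gputasks    one digit per GPU task, in rank order: six PP tasks on
    #              GPU 0, the last PP task plus the PME task on GPU 1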

Re: [gmx-users] Domain decomposition for parallel simulations

2018-02-09 Thread Mark Abraham
Hi,

On Fri, Feb 9, 2018, 17:15 Kevin C Chan wrote:
> Dear Users,
>
> I have encountered the problem of "There is no domain decomposition for n
> nodes that is compatible with the given box and a minimum cell size of x
> nm" and by reading through the GROMACS website

[gmx-users] Do I need to use a position restraint during the equilibration stage (NVT) if I am preparing an amorphous sample?

2018-02-09 Thread sanjeet kumar singh ch16d012
Hi list, I am preparing an amorphous sample using GROMACS, but I am in doubt: during the equilibration stages (NVT & NPT), do I need to put a position restraint on my polymer, given that there is no solvent in my system? And if I do have to use a position restraint, why should I do that? THANKS, SK --

Re: [gmx-users] Regarding Beta-alanine structure

2018-02-09 Thread Dilip H N
Hello, I got the zwitterionic structure of beta-alanine from the ChEBI website; the ChEBI ID is CHEBI:57966. I downloaded it in mol2 format and tried the CGenFF and SwissParam servers to get the CHARMM FF, but I got two different sets of charges, one from each server. In the case of CGenFF the

[gmx-users] Gromacs 2018 installation failed

2018-02-09 Thread Qinghua Liao
Dear GMX developers, I am trying to install Gromacs 2018 with CUDA on clusters. The installation was successful on one cluster but failed on the other, so I guess there might be some library missing on the second cluster. For the one that succeeded, the operating system is openSUSE 42.2
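For comparison, a typical CUDA-enabled configure line for the 2018 release (all paths are placeholders for the cluster's own locations):

    cmake .. -DGMX_GPU=on \
             -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda \
             -DGMX_BUILD_OWN_FFTW=ON \
             -DCMAKE_INSTALL_PREFIX=$HOME/opt/gromacs-2018
    make -j8 && make check && make install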

[gmx-users] Fwd: MMPBSA

2018-02-09 Thread RAHUL SURESH
Dear all, I have carried out a protein-ligand simulation for 50 ns and performed a PBSA calculation on the 10-20 ns part of the trajectory. I get a positive binding energy. How can I tackle it? Thank you -- Regards, Rahul Suresh, Research Scholar, Bharathiar University, Coimbatore