Re: [gmx-users] error on opening gmx_mpi

2018-12-19 Thread p buscemi
I've finally got the ducks in order. The command mpirun -np 8 gmx_mpi mdrun -deffnm PVP20k.nvt auto-maps the processes as: On host xxx, 2 GPUs auto-selected for this run. Mapping of GPU IDs to the 8 GPU tasks in the 8 ranks on this node: PP:0,PP:0,PP:0,PP:0,PP:1,PP:1,PP:1,PP:1; this for the 32
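
For reference, the mapping reported above can also be requested explicitly rather than auto-selected. A minimal sketch, assuming a node with two GPUs and the GROMACS 2018 MPI build from this thread (the -gputasks string is an illustration, not taken from the post):

    # Auto-assignment, as in the post above
    mpirun -np 8 gmx_mpi mdrun -deffnm PVP20k.nvt

    # The same mapping stated explicitly: one GPU task per PP rank,
    # ranks 0-3 on GPU 0 and ranks 4-7 on GPU 1
    mpirun -np 8 gmx_mpi mdrun -deffnm PVP20k.nvt -nb gpu -gputasks 00001111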

Re: [gmx-users] doubt regarding order parameter calculated by gmx chi S2max and S2min

2018-12-19 Thread Mario Andres Rodriguez Pineda
Good afternoon. How can I calculate the S2 order parameters? Thanks. On Wed, 19 Dec 2018 at 11:51, Dr Tushar Ranjan Moharana <tusharranjanmohar...@gmail.com> wrote: > Hi Everyone, > > Are the S2max and S2min produced by gmx chi the same as Ss (slow order > parameter) and Sf (fast order

Re: [gmx-users] contact analysis between the all backbone of same protein

2018-12-19 Thread Peter Kroon
Hi Shahee, what cutoff do you mean? You can find sample Martini MDP files here: http://cgmartini.nl/index.php/force-field-parameters/input-parameters. The usual VDW and Coulomb cutoffs are 1.1 nm. Peter On 19-12-18 09:30, SHAHEE ISLAM wrote: > thank you so much for your reply. > I have
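
For readers looking for concrete numbers, a minimal sketch of the nonbonded block of a Martini MDP file with the 1.1 nm cutoffs Peter mentions; the values follow the commonly used Verlet/reaction-field parameter set and should be checked against the sample files at the cgmartini.nl link above:

    ; Nonbonded settings commonly used with Martini (sketch only; verify
    ; against the sample MDP files from cgmartini.nl)
    cutoff-scheme    = Verlet
    nstlist          = 20
    coulombtype      = reaction-field
    rcoulomb         = 1.1
    epsilon_r        = 15      ; 2.5 when using polarizable water
    vdw-type         = cutoff
    vdw-modifier     = Potential-shift-verlet
    rvdw             = 1.1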

Re: [gmx-users] error on opening gmx_mpi

2018-12-19 Thread Shi Li
Quoting Message 3 (Tue, 18 Dec 2018 21:51:41 -0600, from paul buscemi to gmx-us...@gromacs.org, Subject: Re: [gmx-users] error on opening gmx_mpi): > Shi, thanks for the note

[gmx-users] Install Gromacs from Debian/Ubuntu repository vs build from source

2018-12-19 Thread Zhang Shenqiu
Dear Everyone, I am a beginner with GROMACS and found that it can be installed with apt-get install gromacs on Debian/Ubuntu. But I hesitate to use it because this option is not listed or mentioned in the installation guide. http://manual.gromacs.org/documentation/2018/install-guide/index.html

Re: [gmx-users] contact analysis between the all backbone of same protein

2018-12-19 Thread Peter Kroon
Hi Shahee, that's a nontrivial question ;) It depends on what you call a contact; it probably has something to do with the vdw radius of the CG beads. Either way, we can't answer that question for you. Peter On 19-12-18 10:57, SHAHEE ISLAM wrote: > hi, > i am doing the contact between the

[gmx-users] contacts

2018-12-19 Thread Yasser Almeida Hernández
Hello, I did a coarse-grained MD run of a membrane protein embedded in a model bilayer and I want to compute the contacts between my protein and the PO4 beads of a certain lipid. I am using the following command: gmx mindist -s md_protein_membrane_1.tpr -f md_protein_membrane_centered_run1.xtc -or
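
A hedged sketch of how such a mindist contact analysis is typically completed; the index group names (Protein, PO4) and the 0.6 nm contact cutoff are assumptions for illustration, not values from the post:

    # Minimum distance and contact count between the protein and the PO4 beads
    # (group names and the 0.6 nm cutoff are assumptions; adjust to your index.ndx)
    gmx mindist -s md_protein_membrane_1.tpr \
                -f md_protein_membrane_centered_run1.xtc \
                -n index.ndx -od mindist.xvg -on numcont.xvg -d 0.6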

Re: [gmx-users] Simulation Across Mulitple Nodes with GPUs and PME

2018-12-19 Thread Kutzner, Carsten
Hi, > On 18. Dec 2018, at 18:04, Zachary Wehrspan wrote: > > Hello, > > I have a quick question about how GROMACS 2018.5 distributes GPU resources > across multiple nodes all running one simulation. Reading the > documentation, I think it says that only 1 GPU can be assigned to the PME >
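
For context, a hedged sketch of the kind of launch this question is about, assuming a GROMACS 2018 MPI build and PME confined to a single separate rank (the rank count and file name are placeholders; whether and how this scales across nodes is exactly what the thread goes on to discuss):

    # 8 MPI ranks in total: 7 PP ranks plus 1 dedicated PME rank,
    # with short-range work and PME both offloaded to GPUs
    mpirun -np 8 gmx_mpi mdrun -deffnm md -nb gpu -pme gpu -npme 1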

Re: [gmx-users] contact analysis between the all backbone of same protein

2018-12-19 Thread SHAHEE ISLAM
Hi, I am computing contacts between all the backbone beads within a protein using the command below: gmx mdmat -f *.trr/xtc -s *.tpr -n index.ndx -mean dm.xpm -t 0.5, where -t is the truncation distance. What value should I use for a Martini coarse-grained system? By changing the -t value the plots are
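
For reference on what -t does, a minimal sketch with generic file names; the 1.5 nm value below is the tool's default and is shown only to illustrate that -t sets the distance at which the matrix is truncated (i.e. the top of the color scale), not as a recommendation for Martini:

    # Mean distance matrix over the selected backbone beads; distances larger
    # than -t are truncated to that value in the .xpm output
    gmx mdmat -f traj.xtc -s topol.tpr -n index.ndx -mean dm.xpm -t 1.5

    # Render the matrix for viewing
    gmx xpm2ps -f dm.xpm -o dm.eps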

Re: [gmx-users] contact analysis between the all backbone of same protein

2018-12-19 Thread SHAHEE ISLAM
Thank you so much for your reply. I have done it according to your instructions. Can you please tell me what the best cutoff would be for the Martini coarse-grained force field? Thanking you, Shahee. On 12/18/18, soumadwip ghosh wrote: > Hi, > > You can obtain the backbone atom contacts in the same protein using

Re: [gmx-users] Protein-ligand complex simulation and ATBserver file

2018-12-19 Thread Prasanth G, Research Scholar
Dear Sir, I am running GROMACS 5.1.4 on a server. I had converted the protein (PDB to GRO) using the latest GROMOS force field. As per your suggestion, I had downloaded the parameter files from the ATB server. As per the readme file from the server (in the parameters folder), I had updated my topol.top
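
A hedged sketch of what a topol.top typically looks like after adding an ATB ligand topology; every file and molecule name here (ligand_atb.itp, LIG, the chain ITP) is a placeholder, not the actual name from the ATB download or this user's system:

    ; topol.top -- sketch only; file and molecule names are placeholders
    #include "gromos54a7.ff/forcefield.itp"   ; GROMOS force field used for the protein
    #include "ligand_atb.itp"                 ; ligand ITP from the ATB server
                                              ; (its [ atomtypes ] must appear before
                                              ;  any moleculetype definitions)
    #include "topol_Protein_chain_A.itp"      ; protein topology from pdb2gmx
    #include "gromos54a7.ff/spc.itp"          ; water
    #include "gromos54a7.ff/ions.itp"         ; ions

    [ system ]
    Protein-ligand complex in water

    [ molecules ]
    ; name            count
    Protein_chain_A   1
    LIG               1
    SOL               10000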

[gmx-users] doubt regarding order parameter calculated by gmx chi S2max and S2min

2018-12-19 Thread Dr Tushar Ranjan Moharana
Hi Everyone, Are the S2max and S2min produced by gmx chi the same as Ss (slow order parameter) and Sf (fast order parameter), respectively? If not, could anybody please explain what they are (S2max and S2min)? Thanks a lot. Sincerely, Tushar

[gmx-users] Discarded angle between 3 virtual sites

2018-12-19 Thread Jonathan Barnoud
Hello, a colleague of mine stumbled upon an issue with angles between 3 virtual sites. If the 3 virtual sites are "virtual sites N", then the angle is removed as being a constant-energy interaction. This can be prevented by passing -normvsbds to grompp. However, using the option is a
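
For readers unfamiliar with the flag: -normvsbds tells grompp not to remove constant bonded interactions involving virtual sites. A minimal sketch with generic input file names:

    # Keep bonded terms involving virtual sites (including the constant-energy
    # angle described above) instead of letting grompp strip them
    gmx grompp -f md.mdp -c conf.gro -p topol.top -normvsbds -o topol.tpr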

Re: [gmx-users] error on opening gmx_mpi

2018-12-19 Thread p buscemi
Here is the output from the gmx_mpi command. I would think the correct version of mdrun would be installed; maybe I could point to it in my PATH? hms@rgb2 ~/Desktop/PVP20k $ gmx_mpi :-) GROMACS - gmx_mpi,

Re: [gmx-users] error on opening gmx_mpi

2018-12-19 Thread Justin Lemkul
On Wed, Dec 19, 2018 at 11:44 AM p buscemi wrote: > Shi, > > reinstalling the mpi version using gmx 18.4 did not help... any ideas? > hms@rgb2 ~/Desktop/PVP20k $ mpirun -np 8 mdrun_mpi -deffnm PVP20k1.em > :-) GROMACS - mdrun_mpi, VERSION 5.1.2 (-: > > You're just calling the same

Re: [gmx-users] error on opening gmx_mpi

2018-12-19 Thread p buscemi
Shi, reinstalling the mpi version using gmx 18.4 did not help... any ideas? hms@rgb2 ~/Desktop/PVP20k $ mpirun -np 8 mdrun_mpi -deffnm PVP20k1.em :-) GROMACS - mdrun_mpi, VERSION 5.1.2 (-: GROMACS: mdrun_mpi, VERSION 5.1.2 Executable: /usr/bin/mdrun_mpi.openmpi Data prefix: /usr Command

Re: [gmx-users] error on opening gmx_mpi

2018-12-19 Thread pbuscemi
Thank you both very much again. The "mpir_run -npx gmx -mdrun." command was lifted from a Feb 2018 response from Szilard to a multi-GPU user, which he used as an example. I'll crank on your pointers right now. Paul

Re: [gmx-users] error on opening gmx_mpi

2018-12-19 Thread p buscemi
Getting closer... (thinking a bit about the initial command structure does help). Now using the command: gmx_mpi mdrun -deffnm PVP20k.nvt -nb gpu -ntomp 16 -npme 4 :-) GROMACS - gmx mdrun, 2018.4 (-: This gets past the v5 issue, but a new nastygram is sent... "The -dd or -npme option request a
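
The complaint appears because -npme asks for dedicated PME ranks while this launch starts only a single rank. A hedged sketch of the two usual fixes, reusing the file name from the post (the thread and rank counts are illustrative for the 32-core machine mentioned earlier, not prescriptive):

    # Single rank: drop -npme and let the one rank handle PME itself
    gmx_mpi mdrun -deffnm PVP20k.nvt -nb gpu -ntomp 16

    # Or launch a real parallel run so -npme has ranks to assign:
    # 8 ranks (7 PP + 1 PME), 4 OpenMP threads each
    mpirun -np 8 gmx_mpi mdrun -deffnm PVP20k.nvt -nb gpu -npme 1 -ntomp 4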

Re: [gmx-users] Protein-ligand complex simulation and ATBserver file

2018-12-19 Thread Justin Lemkul
On Wed, Dec 19, 2018 at 4:34 AM Prasanth G, Research Scholar <prasanthgha...@sssihl.edu.in> wrote: > Dear Sir, > > I am running GROMACS 5.1.4 on a server. > I had converted the protein (PDB to GRO) using the latest GROMOS force field. > As per your suggestion, I had downloaded the parameter files

Re: [gmx-users] error on opening gmx_mpi

2018-12-19 Thread paul buscemi
Shi, Justin straightened me out regarding the command structure; I used "mpirun -np 8 gmx_mpi mdrun -deffnm Run_file.nvt". But for the time being I've given up on two GPUs with the 32-core system. I am now just trying to make the single GPU work well. Paul > On Dec 19, 2018, at 5:51 AM, Shi Li

Re: [gmx-users] Epsilon_r

2018-12-19 Thread Shan Jayasinghe
Hi Justin, Thank you very much. On Tue, Dec 18, 2018 at 12:23 AM Justin Lemkul wrote: > On Mon, Dec 17, 2018 at 1:49 AM Shan Jayasinghe <shanjayasinghe2...@gmail.com> wrote: > > Dear Gromacs Users, > > How do we determine the epsilon_r for an MD simulation? If we do an MD > simulation

Re: [gmx-users] Install Gromacs from Debian/Ubuntu repository vs build from source

2018-12-19 Thread Justin Lemkul
On Wed, Dec 19, 2018 at 6:58 PM Zhang Shenqiu wrote: > Dear Everyone, > > I am a beginner to Gromacs, and found Gromacs can be installed with > apt-get install gromacs on Debian/Ubuntu. But I hesitate to use it because > this option is not listed or mentioned in the installation guide. >

Re: [gmx-users] Install Gromacs from Debian/Ubuntu repository vs build from source

2018-12-19 Thread paul buscemi
In addition to Justin's comments, the repository version is not built for GPU/CUDA use, and as such is good only for very small systems, so it loses one of GROMACS's great advantages over other MD programs. It is not bad as an introduction to GROMACS, so do not be afraid of installing and working with
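
For the build-from-source route both replies recommend, the installation guide's "quick and dirty" sequence looks roughly like this (the version number and the GPU option are assumptions; follow the guide linked above for the authoritative steps):

    # Quick-and-dirty source build, roughly following the GROMACS install guide
    # (adjust the version; -DGMX_GPU=ON requires an installed CUDA toolkit)
    tar xfz gromacs-2018.4.tar.gz
    cd gromacs-2018.4
    mkdir build && cd build
    cmake .. -DGMX_BUILD_OWN_FFTW=ON -DGMX_GPU=ON -DREGRESSIONTEST_DOWNLOAD=ON
    make -j 8
    make check
    sudo make install
    source /usr/local/gromacs/bin/GMXRC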