[gmx-users] Problem with domain decomposition

2016-08-11 Thread hamedifatemeh67
Hi GROMACS users, I am running mdrun and getting the following error: Fatal error: DD cell 0 0 3 could only obtain 1065 of the 1066 atoms that are connected via constraints from the neighboring cells. This probably means your constraint lengths are too long compared to the domain decomposition cell
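
This error usually appears when the domain decomposition cells are smaller than the span of coupled constraints. A minimal sketch of common workarounds, assuming a hypothetical topol.tpr, is to run with fewer MPI ranks (larger cells) or to reserve a larger P-LINCS communication distance with mdrun's -rcon option:

    mpirun -np 4 gmx_mpi mdrun -s topol.tpr
    mpirun -np 8 gmx_mpi mdrun -s topol.tpr -rcon 1.2

Constraining only hydrogen bonds (constraints = h-bonds in the .mdp) also shortens the coupled-constraint span and often avoids the problem.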

Re: [gmx-users] umbrella sampling for determining interfacial energy

2016-08-11 Thread Ray Chao
Thank you very much.

Re: [gmx-users] umbrella sampling for determining interfacial energy

2016-08-11 Thread Dan Gil
Hi, this is a good coincidence, but I think I do something very similar to what you just described. I use umbrella sampling to estimate the free energy profile as a function of distance from a liquid-vapor interface. Then, the free energy cost of adsorption to the interface can be approximated. I
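
For readers who want a concrete starting point, a minimal pull-code sketch for one umbrella window along a distance coordinate (GROMACS 5.x .mdp syntax; the group names, force constant and window centre are placeholders, not values from this thread) could look like:

    pull                 = yes
    pull-ngroups         = 2
    pull-ncoords         = 1
    pull-group1-name     = Slab
    pull-group2-name     = Solute
    pull-coord1-type     = umbrella
    pull-coord1-geometry = distance
    pull-coord1-groups   = 1 2
    pull-coord1-k        = 1000    ; kJ mol^-1 nm^-2
    pull-coord1-init     = 0.5     ; window centre in nm
    pull-coord1-rate     = 0.0     ; keep the window fixed

The per-window histograms are then combined with gmx wham to obtain the free energy profile.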

Re: [gmx-users] radial-density profile and radial distribution curve for water-CNT system

2016-08-11 Thread Dan Gil
The manual for version 4.6 is available online. The analysis command you want is g_rdf, for version 4.6. g_rdf plots the radial distribution function, which is what I think you are calling radial distribution curve. The manual has a nice explanation of it so please check it out. The radial density
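
As a minimal sketch for version 4.6 (file names are placeholders), the call is:

    g_rdf -f traj.xtc -s topol.tpr -n index.ndx -o rdf.xvg

g_rdf then prompts for a reference group and one or more selection groups from the index file.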

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread Albert
I just found that I compiled the PLUMED plugin with a different MPI and then patched GROMACS. Now I've recompiled everything from scratch, and finally it works. Thanks a lot. On 08/11/2016 05:55 PM, Szilárd Páll wrote: It should. You can always verify it in the header of the log file. It's always useful

Re: [gmx-users] Trying to add Buck.ham (SR) while the default nonbond type is LJ (SR)

2016-08-11 Thread Mark Abraham
Hi, How do you want your atoms with LJ parameters to interact with atoms with Buckingham parameters? Mark On Fri, 5 Aug 2016 08:53 Andreas Mecklenfeld < a.mecklenf...@tu-braunschweig.de> wrote: > Dear Gromacs-users, > > I'm trying to modify some intermolecular Lennard-Jones interactions >

Re: [gmx-users] Can somebody explain how to set (XY) hexagonal symmetry pbc in gromacs ?

2016-08-11 Thread Mark Abraham
Hi, The trick is to recognise that a hexagonal cell is equivalent to the triclinic cell that is formed from the centres of four adjacent hexagonal cells. You need to describe that triclinic cell. Probably the recipes you can find in the archive are doing just that. Mark On Fri, 5 Aug 2016 11:41
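
As a rough illustration (box lengths are placeholders; the 60 degree angle in the XY plane is what makes the replicated cell hexagonal), such a triclinic box can be set with editconf:

    gmx editconf -f conf.gro -o boxed.gro -bt triclinic -box 5.0 5.0 6.0 -angles 90 90 60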

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread Szilárd Páll
On Thu, Aug 11, 2016 at 4:22 PM, Albert wrote: > well, here is the command line I used for compiling: > > > env CC=mpicc CXX=mpicxx F77=mpif90 FC=mpif90 LDF90=mpif90 > CMAKE_PREFIX_PATH=/soft/gromacs/fftw-3.3.4:/soft/intel/impi/5.1.3.223 cmake > .. -DBUILD_SHARED_LIB=OFF
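
For comparison, a minimal MPI-enabled configure sketch (paths are placeholders, and GMX_GPU is only needed for a GPU build) would be:

    CC=mpicc CXX=mpicxx cmake .. -DGMX_MPI=ON -DGMX_GPU=ON -DCMAKE_INSTALL_PREFIX=/opt/gromacs-5.1.3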

Re: [gmx-users] LINCS warnings at high temperature md run

2016-08-11 Thread Mark Abraham
Hi, Fundamentally, at higher temperature you have higher atomic velocities, so atoms move further in a step. Your simulation is only stable if you apply constraints, but the default settings are chosen for normal temperatures and thus displacements. So try the kinds of things Chris suggests.
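
As an illustration of that kind of tightening (the values are examples, not recommendations from this thread), the relevant .mdp settings are typically:

    dt           = 0.001     ; smaller time step at elevated temperature
    constraints  = h-bonds
    lincs-order  = 6         ; default is 4
    lincs-iter   = 2         ; default is 1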

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread Mark Abraham
Hi, Configuration of MPI also happens when mpirun acts. You need to have set things up so that those two ranks are assigned to hardware the way you want. Your output looks like there are two processes, but they aren't organised by mpirun so that they know to talk to each other. Mark On Thu, 11 Aug

Re: [gmx-users] Creating topology for Cu-containing enzyme, GROMOS96 force field.

2016-08-11 Thread João M . Damas
Hi Francesca, For previous works on copper proteins ( http://pubs.acs.org/doi/abs/10.1021/ct500196e), I have used specbond.dat as Marlon suggested for coppers bound to the protein. For coppers bound to a co-factor, I would assume you are building an .itp for the co-factor, so I would include them
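
For readers unfamiliar with specbond.dat: the file starts with the number of entries, and each entry lists resA atomA nbondsA resB atomB nbondsB bond-length(nm) newresA newresB. The copper-cysteine line below is purely illustrative (the names and the 0.22 nm distance are assumptions, not values from the cited work):

    1
    CYS  SG  1  CU  CU  1  0.22  CYS2  CU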

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread Szilárd Páll
On Wed, Aug 10, 2016 at 4:03 PM, Albert wrote: > Hello: > > I am trying to submit gromacs jobs with command line: > > mpirun -np 2 gmx_mpi mdrun -s 61.tpr -v -g 61.log -c 61.gro -x 61.xtc -ntomp > 10 -gpu_id 01 > > However, it failed with messages: > > > >Number of GPUs

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread Szilárd Páll
PPS: given the double output (e.g. "Reading file 61.tpr, ...") it's even more likely that you're using a non-MPI build. BTW, looks like you had the same issue about two years ago: https://mailman-1.sys.kth.se/pipermail/gromacs.org_gmx-users/2014-September/092046.html -- Szilárd On Thu, Aug 11,

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread Szilárd Páll
Using a non-MPI launch command won't be useful in starting an MPI-enabled build, so that's not correct. Additionally, please use _reply_ to answer emails to avoid breaking threads. -- Szilárd On Thu, Aug 11, 2016 at 6:50 AM, Nikhil Maroli wrote: > gmx mdrun -nt X -v

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread Szilárd Páll
PS: Or your GROMACS installation uses _mpi suffixes, but it is actually not building with MPI enabled. -- Szilárd On Thu, Aug 11, 2016 at 4:05 PM, Szilárd Páll wrote: > On Wed, Aug 10, 2016 at 4:03 PM, Albert wrote: >> Hello: >> >> I am trying to
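
A quick way to check which flavour was actually built (assuming the binary is on your PATH) is the version header, which reports the MPI library in use:

    gmx_mpi --version | grep -i "MPI library"

A real MPI build should report "MPI library: MPI", whereas a thread-MPI build reports "MPI library: thread_mpi".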

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread Justin Lemkul
On 8/11/16 9:37 AM, Albert wrote: Here is what I got for command: mpirun -np 2 gmx_mpi mdrun -v -s 62.tpr -gpu_id 0 It seems that it still used 1 GPU instead of 2. I don't understand why.

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread Albert
Here is what I got for command: mpirun -np 2 gmx_mpi mdrun -v -s 62.tpr -gpu_id 0 It seems that it still used 1 GPU instead of 2. I don't understand why.

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread Justin Lemkul
On 8/11/16 9:08 AM, Albert wrote: Hi, I used your suggested command line, but it failed with the following messages: --- Program gmx mdrun, VERSION 5.1.3 Source code file:

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread jkrieger
I'd suggest installing another GROMACS version without MPI then. I imagine your system doesn't have enough CPU nodes to support it, as you asked for 2 and got 1. You could try the following first, though: mpirun -np 2 gmx_mpi mdrun -ntomp 10 -v -s 62.tpr -gpu_id 01 That way, rather than having 1

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread Albert
Hi, I used your suggested command line, but it failed with the following messages: --- Program gmx mdrun, VERSION 5.1.3 Source code file: /home/albert/Downloads/gromacs/gromacs-5.1.3/src/gromacs/gmxlib/gmx_detect_hardware.cpp, line: 458

[gmx-users] Vibrational Power Spectrum

2016-08-11 Thread Alexander Alexander
Dear GROMACS users, Would you please let me know which kind of experimental spectrum (IR, Raman, XRD, ...) would be comparable to the "Vibrational Power Spectrum" calculable via the velocity autocorrelation function with gmx velacc in GROMACS? Thanks, Regards, Alex
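
For context, the spectrum in question comes from a command along these lines (file names are placeholders; the trajectory must contain velocities):

    gmx velacc -f traj.trr -s topol.tpr -o vac.xvg -os spectrum.xvg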

Re: [gmx-users] Creating topology for Cu-containing enzyme, GROMOS96 force field.

2016-08-11 Thread Justin Lemkul
On 8/10/16 3:26 PM, Francesca Lønstad Bleken wrote: I am interested in a metalloenzyme with Cu and I have found several studies in the literature on systems similar to mine using GROMACS and the Gromos force field. I see that GROMOS contains parameters for Cu, and I intend to keep the

Re: [gmx-users] produce CHARMM topology

2016-08-11 Thread Justin Lemkul
On 8/11/16 4:37 AM, a.om...@shirazu.ac.ir wrote: Thank you. I have some problems with Python and its packages for converting it; if I can't do that, I will ask you. Now I have another problem: I have defined a new residue consisting of an ASN + 5 sugars. I have used the CHARMM36 ff, but I

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread jkrieger
The problem is that you compiled GROMACS with MPI (hence the _mpi suffix in your command). You therefore need to set the number of MPI processes rather than threads. The appropriate command would instead be the following: mpirun -np 2 gmx_mpi mdrun -v -s 62.tpr -gpu_id 01 Alternatively you could
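
As a side-by-side sketch (the 62.tpr name is from the thread; the thread counts are examples), the two launch modes are:

    # real MPI build (gmx_mpi): the rank count comes from mpirun
    mpirun -np 2 gmx_mpi mdrun -v -s 62.tpr -gpu_id 01
    # thread-MPI build (gmx): the rank count comes from -ntmpi
    gmx mdrun -ntmpi 2 -ntomp 10 -v -s 62.tpr -gpu_id 01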

Re: [gmx-users] MPI GPU job failed

2016-08-11 Thread Albert
Hello: I tried to run the command: gmx_mpi mdrun -nt 2 -v -s 62.tpr -gpu_id 01 but it failed with the messages: --- Program gmx mdrun, VERSION 5.1.3 Source code file:

Re: [gmx-users] Creating topology for Cu-containing enzyme, GROMOS96 force field.

2016-08-11 Thread Marlon Sidore
Hello, pdb2gmx needs the parameters for your co-factors, else it won't recognize them. It should recognize the Cu alone, though, if it has the same name as in the topology. You will probably need to obtain the parameters for your co-factors from the relevant papers. If you can get ready-to-use

[gmx-users] Problem compiling on Bluegene/Q

2016-08-11 Thread Jernej Zidar
Hi guys, I am trying to compile Gromacs 2016 for a Bluegene/Q machine and I've encountered a small error during the cmake/configure stage: [ihpczidj@cumulus gromacs-build]$ rm -rf * && cmake ../gromacs-2016 -DCMAKE_TOOLCHAIN_FILE=Platform/BlueGeneQ-static-bgclang-CXX -DGMX_MPI=ON

[gmx-users] radial-density profile and radial distribution curve for water-CNT system

2016-08-11 Thread Ankita Joshi
Dear Gromacs Users, I am working on a water/double-walled carbon nanotube system. After completing the simulation using Gromacs 4.6.5, I found that water molecules are present inside the double-walled carbon nanotube as well as in the bulk of the system. I want to plot the radial