Hi GROMACS users,
I am running mdrun and getting the following error:
Fatal error:
DD cell 0 0 3 could only obtain 1065 of the 1066 atoms that are connected via
constraints from the neighboring cells. This probably means your constraint
lengths are too long compared to the domain decomposition cell
Thank you very much.
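For reference, this error usually means the domain decomposition cells have
become smaller than the distance over which constrained atoms must be
communicated. A minimal sketch of the usual workarounds, with placeholder file
names and values rather than anything taken from this post:
  gmx mdrun -deffnm md -rcon 0.9    # request a larger minimum cell size for constraint communication
  gmx mdrun -deffnm md -ntmpi 4     # or, with a thread-MPI build, use fewer ranks so each cell is larger
Which of these is appropriate depends on the system size and the constraint setup.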
Hi, this is a good coincidence, but I think I do something very similar to
what you just described.
I use umbrella sampling to estimate the free energy profile as a function
of distance from a liquid-vapor interface. Then, the free energy cost of
adsorption to the interface can be approximated. I
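For anyone setting up something similar: the restraint along the interface
normal is handled by the pull code. A rough sketch of the relevant .mdp block
in the 5.x option naming (group names, force constant and window centre are
placeholders, and the option names differ slightly in 4.6):
  pull                 = yes
  pull-ngroups         = 2
  pull-ncoords         = 1
  pull-group1-name     = SLAB     ; hypothetical reference group (the liquid slab)
  pull-group2-name     = SOLUTE   ; hypothetical pulled group
  pull-coord1-type     = umbrella
  pull-coord1-geometry = distance
  pull-coord1-groups   = 1 2
  pull-coord1-k        = 1000     ; kJ mol^-1 nm^-2, placeholder
  pull-coord1-init     = 1.0      ; nm, placeholder window centre
One such run per window, followed by gmx wham over all windows, gives the
free energy profile.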
The manual for version 4.6 is available online. The analysis command you
want in that version is g_rdf. It plots the radial distribution function,
which is what I think you are calling a radial distribution curve. The manual
has a nice explanation of it, so please check it out. The radial
density
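For 4.6 the invocation is simply something like (file names are placeholders):
  g_rdf -f traj.xtc -s topol.tpr -o rdf.xvg
g_rdf then prompts for the reference group and the group(s) to compute the
RDF against.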
I just found that I had compiled the PLUMED plugin with a different MPI and
then patched GROMACS with it.
Now that I have recompiled everything from scratch, it finally works.
Thanks a lot
On 08/11/2016 05:55 PM, Szilárd Páll wrote:
It should. You can always verify it in the header of the log file.
It's always useful
Hi,
How do you want your atoms with LJ parameters to interact with atoms with
Buckingham parameters?
Mark
On Fri, 5 Aug 2016 08:53 Andreas Mecklenfeld <
a.mecklenf...@tu-braunschweig.de> wrote:
> Dear Gromacs-users,
>
> I'm trying to modify some intermolecular Lennard-Jones interactions
>
Hi,
The trick is to recognise that a hexagonal cell is equivalent to the
triclinic cell that is formed from the centres of four adjacent hexagonal
cells. You need to describe that triclinic cell. Probably the recipes you
can find in the archive are doing just that.
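In practice that comes down to asking editconf for a triclinic box with two
equal in-plane vectors and a 60 degree angle between them, something like
(the box lengths are placeholders):
  gmx editconf -f conf.gro -o hex.gro -bt triclinic -box 5 5 10 -angles 90 90 60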
Mark
On Fri, 5 Aug 2016 11:41
On Thu, Aug 11, 2016 at 4:22 PM, Albert wrote:
> well, here is the command line I used for compiling:
>
>
> env CC=mpicc CXX=mpicxx F77=mpif90 FC=mpif90 LDF90=mpif90
> CMAKE_PREFIX_PATH=/soft/gromacs/fftw-3.3.4:/soft/intel/impi/5.1.3.223 cmake
> .. -DBUILD_SHARED_LIB=OFF
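For what it's worth, a build configured consistently against a single MPI
stack would look something like the sketch below; the important parts are
-DGMX_MPI=ON and pointing the compilers at the same MPI wrappers that were
used to build PLUMED (paths as in the quoted command, everything else
illustrative):
  CC=mpicc CXX=mpicxx cmake .. -DGMX_MPI=ON \
    -DCMAKE_PREFIX_PATH=/soft/gromacs/fftw-3.3.4:/soft/intel/impi/5.1.3.223 \
    -DBUILD_SHARED_LIBS=OFF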
Hi,
Fundamentally, at higher temperature you have higher atomic velocities, so
atoms move further in a step. Your simulation is only stable if you apply
constraints, but the default settings are chosen for normal temperatures, and
thus for normal displacements. So try the kinds of things Chris suggests.
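Concretely, the usual knobs are a shorter time step and tighter LINCS
settings in the .mdp; a sketch with purely illustrative values:
  dt          = 0.001     ; ps, i.e. half the usual 2 fs step
  constraints = h-bonds
  lincs-order = 6         ; default is 4
  lincs-iter  = 2         ; default is 1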
Hi,
Configuration of MPI also happens when mpirun runs. You need to have set
things up so that those two ranks are assigned to hardware the way you
want. Your output looks like there are two processes, but they haven't been
organised by mpirun so that they know to talk to each other.
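As a sketch, with Open MPI (other MPI implementations use different placement
flags), explicit placement would look something like:
  mpirun -np 2 --map-by ppr:1:socket --bind-to socket gmx_mpi mdrun -ntomp 10 -gpu_id 01 -s 61.tpr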
Mark
On Thu, 11 Aug
Hi Francesca,
For previous works on copper proteins (
http://pubs.acs.org/doi/abs/10.1021/ct500196e), I have used specbond.dat as
Marlon suggested for coppers bound to the protein. For coppers bound to a
co-factor, I would assume you are building an .itp for the co-factor, so I
would include them
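For the protein-bound coppers, each specbond.dat entry is one line giving the
two residue/atom names, the number of bonds each atom makes, the bond length,
and the residue names after binding; remember that the first line of the file
is the total number of entries and must be incremented. A purely hypothetical
entry for a Cu coordinated by a histidine ND1 (names, bond counts and distance
are placeholders to adapt to your force field):
  HIS   ND1   1   CU    CU    0   0.21   HIS1   CU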
On Wed, Aug 10, 2016 at 4:03 PM, Albert wrote:
> Hello:
>
> I am trying to submit gromacs jobs with command line:
>
> mpirun -np 2 gmx_mpi mdrun -s 61.tpr -v -g 61.log -c 61.gro -x 61.xtc -ntomp
> 10 -gpu_id 01
>
> However, it failed with messages:
>
>
>
>Number of GPUs
PPS: given the double output (e.g. "Reading file 61.tpr, ...") it's
even more likely that you're using a non-MPI build.
BTW, looks like you had the same issue about two years ago:
https://mailman-1.sys.kth.se/pipermail/gromacs.org_gmx-users/2014-September/092046.html
--
Szilárd
On Thu, Aug 11,
Using a non-MPI launch command won't be useful in starting an
MPI-enabled build, so that's not correct.
Additionally, please use _reply_ to answer emails to avoid breaking threads.
--
Szilárd
On Thu, Aug 11, 2016 at 6:50 AM, Nikhil Maroli wrote:
> gmx mdrun -nt X -v
PS: Or your GROMACS installation uses _mpi suffixes, but it was
actually not built with MPI enabled.
--
Szilárd
On Thu, Aug 11, 2016 at 4:05 PM, Szilárd Páll wrote:
> On Wed, Aug 10, 2016 at 4:03 PM, Albert wrote:
>> Hello:
>>
>> I am trying to
Here is what I got for command:
mpirun -np 2 gmx_mpi mdrun -v -s 62.tpr -gpu_id 0
It seems that it still used 1 GPU instead of 2. I don't understand why.
On 8/11/16 9:08 AM, Albert wrote:
Hi, I used your suggested command line, but it failed with the following
messages:
---
Program gmx mdrun, VERSION 5.1.3
Source code file:
I'd suggest installing another GROMACS version without MPI then. I imagine
your system doesn't have enough CPU nodes to support it, as you asked for 2
and got 1. You could try the following first, though:
mpirun -np 2 gmx_mpi mdrun -ntomp 10 -v -s 62.tpr -gpu_id 01
That way rather than having 1
Hi, I used your suggested command line, but it failed with the following
messages:
---
Program gmx mdrun, VERSION 5.1.3
Source code file:
/home/albert/Downloads/gromacs/gromacs-5.1.3/src/gromacs/gmxlib/gmx_detect_hardware.cpp,
line: 458
Dear GROMACS users,
Would you please let me know which kind of experimental spectrum (IR/infrared,
Raman, XRD, ...) would be comparable to the "Vibrational Power Spectrum"
calculable via the velocity autocorrelation function with gmx velacc in
GROMACS?
Thanks,
Regards,
Alex
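For context, the power spectrum in question is produced by something like the
following (file names are placeholders; the trajectory must contain
velocities, i.e. a .trr written with nstvout > 0):
  gmx velacc -f traj.trr -s topol.tpr -o vac.xvg -os spectrum.xvg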
On 8/10/16 3:26 PM, Francesca Lønstad Bleken wrote:
I am interested in a metalloenzyme with Cu and I have found several studies
in the literature on systems similar to mine using GROMACS and the Gromos
force field. I see that GROMOS contains parameters for Cu, and I intend to
keep the
On 8/11/16 4:37 AM, a.om...@shirazu.ac.ir wrote:
Thankyou
I have some problem about pythoon and packages of that for converting it, if I
couldnt do that, I will ask you.
now I have an other problem:
I have defined a new residue included of an ASN + 5 sugures, I have used
charmm36 ff , but I
The problem is that you compiled GROMACS with MPI (hence the _mpi suffix in
your command). You therefore need to set the number of MPI processes rather
than the number of threads. The appropriate command would instead be the
following:
mpirun -np 2 gmx_mpi mdrun -v -s 62.tpr -gpu_id 01
Alternatively you could
Hello:
I tried to run the command:
gmx_mpi mdrun -nt 2 -v -s 62.tpr -gpu_id 01
but it failed with the following messages:
---
Program gmx mdrun, VERSION 5.1.3
Source code file:
Hello,
pdb2gmx needs the parameters for your co-factors, otherwise it won't recognize
them. It should recognize the Cu alone, though, if it has the same name as in
the topology.
You will probably need to obtain the parameters for your co-factors from
the relevant papers. If you can get ready-to-use
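Once you do have parameters, a typical arrangement is a stand-alone .itp that
is included from the system topology; a minimal sketch (the file name
cofactor.itp and molecule name COF are hypothetical):
  ; in topol.top, after the force-field include
  #include "cofactor.itp"

  [ molecules ]
  ; Compound        #mols
  Protein           1
  COF               1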
Hi guys,
I am trying to compile Gromacs 2016 for a Bluegene/Q machine and I've
encountered a small error during the cmake/configure stage:
[ihpczidj@cumulus gromacs-build]$ rm -rf * && cmake ../gromacs-2016
-DCMAKE_TOOLCHAIN_FILE=Platform/BlueGeneQ-static-bgclang-CXX -DGMX_MPI=ON
Dear GROMACS Users,
I am working on a water / double-walled carbon nanotube system. After the
completion of a simulation using GROMACS 4.6.5, I found that water molecules
are present inside the double-walled carbon nanotube as well as in the bulk of
the system. I want to plot the radial