On 16.10.2015 09:14, Mikhail Stukan wrote:
Dear all,
I am facing the following problem during the installation of GROMACS 5.1. After a successful
run of cmake, make and make install, the only executable file I have in the corresponding bin
directory is "gmx" (actually gmx5.1single in my case). No usual
On 11.06.2015 14:07, David McGiven wrote:
Is your 1-3% claim based on the webpage you linked?
Is it reliable to compare GPU performance for gromacs with that of 3D
video games?
OK, you got me on this. As much as I'd wish I cannot
really back up my claim of comparability. I have been
out of
On 11.06.2015 13:08, David McGiven wrote:
We're finally buying some Intel E5-2650 servers + NVIDIA GTX 980 cards.
However, some of the servers come with only PCI-e 3.0 x8 slots and
others with x16 slots.
Do you think this is relevant for gromacs performance? And if so, how
relevant?
On 15.04.2015 18:41, Andrew DeYoung wrote:
Hi,
I'm running Gromacs 4.5.5. Is there a way to determine how Gromacs was
compiled on my system? I would like to know whether the person who
compiled Gromacs used gcc or icc (both compilers are available and were
presumably available when the
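Two hedged ways to find out (a sketch; the exact path of mdrun and the strings a compiler leaves behind vary from build to build):

```shell
# Newer GROMACS builds (4.6+) print build information directly; on 4.5.x
# this may show only the version:
mdrun -version 2>/dev/null | grep -i compiler
# Fallback that works for most binaries: compilers embed identification
# strings in the executable, which 'strings' can dig out:
strings "$(which mdrun)" | grep -i -E 'gcc|icc|intel' | sort -u
```

A gcc-built binary typically contains lines like "GCC: (GNU) 4.x.y", while an icc build leaves Intel-specific strings.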
On 11.03.2015 13:39, Rebeca García Fandiño wrote:
a general question... would it make any sense to mix coarse-grained
and fine-grained parameters in a single simulation?
Usually not, but in special cases yes if you know what
you're doing. What exactly is the problem to investigate,
what
On 24.02.2015 05:08, 라지브간디 wrote:
I have a Windows 7 OS system with an i7 CPU @ 3.40 GHz, 16 GB RAM, and a newly
installed NVIDIA GeForce GTX 960, with 1 TB storage.
Since I am familiar with gromacs only on Linux systems, I am not able to install it
in the Windows environment.
Should I use Cygwin or Visual
On 16.12.2014 08:16, Albert wrote:
I am going to purchase some GPU workstations and each of them will have 3
dedicated GPUs. I am thinking about the GTX 690, GTX 980 or GTX Titan Black.
What CPU(s) is each workstation going to have?
My systems are usually typical biological ones, which contain
On 16.10.2014 14:38, Hadházi Ádám wrote:
Dear GMX Staff and Users,
I am planning to buy a new MD workstation with 4 GPUs (GTX 780 or 970) or 3
GPUs (GTX 980) for $4000.
Could you recommend a setup for this machine?
Are 1 or 2 CPUs necessary? 32/64 GB memory? Cooling? Power?
- What system (size,
On 15.10.2014 06:06, Cai wrote:
I am trying to simulate 2D liquids interacting with a simple potential like
the Lennard-Jones potential.
Can it be done by specifying pcoupltype = semiisotropic in the .mdp
file? I mean enforcing a normal pressure in the x-y directions while applying
a very high pressure in z
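For reference, a semiisotropic coupling block might look like this (a sketch with made-up values, not a validated setup for 2D liquids):

```
pcoupl           = berendsen        ; or Parrinello-Rahman
pcoupltype       = semiisotropic    ; x-y coupled together, z independent
tau-p            = 5.0
ref-p            = 1.0    1000.0    ; x-y pressure, then z pressure (bar)
compressibility  = 4.5e-5 4.5e-5    ; x-y, then z
```

Note that for a truly 2D system, setting the z compressibility to 0 keeps the box height fixed, which is usually cleaner than applying a huge z pressure.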
On 02.10.2014 11:58, Francesco Mambretti wrote:
Or is it true that GROMACS is not able to use user-defined potentials with
the Verlet cutoff scheme? This would be a serious handicap for this code!
Am I forced to use charge groups?
Please read this thread:
On 01.10.2014 03:52, AINUN NIZAR M wrote:
I'm a newbie in MD. I would like to simulate protein folding, from a
primary protein structure to a fully folded state, for a wild-type and a
mutated protein. Then I will compare them and make a movie from the unfolded
to the folded state. I want to use REMD
On 29.09.2014 14:31, Pappu Kumar wrote:
I am wondering if anyone has tested the performance of the new GTX 980 and 970
cards and compared it to the 780/780 Ti/Titan using the input systems given here
As long as I don't have a card, I can only guess.
But until reasonable benchmarks appear, you
On 25.09.2014 12:16, Vedat Durmaz wrote:
gromacs version 4.6.5 Debian/Ubuntu binaries from the Ubuntu repositories.
When we start mdrun, we get a German error message saying:
"Ungültiger Maschinenbefehl" (illegal instruction).
When searching the internet I got the feeling that
On 12.09.2014 09:15, Sathish Kumar wrote:
I constructed a surface; for that I have not set the periodic-molecules option
in the .mdp file, and I fixed the box such that the distance in x, y and z is 2
nm, and then I placed the lipid membrane above the surface.
It's not clear what you did here. Did you
On 12.09.2014 13:30, Sathish Kumar wrote:
Yes, I created a block of SiO2 matching the length of the lipid
membrane, placed it in the center of the box, and kept the distance from the
edges of the box at 2 nm. Here I have not used the periodic-molecules
option in the .mdp file. I hydroxylated the
On 12.09.2014 13:30, Sathish Kumar wrote:
And here I have attached the md.log file and the em.gro file; please check them.
I did run the SiO2 block of your .gro file through my distance
checker, and it looks like you have massive overlap across
the y PBC boundary and one positional duplication:
1709
On 12.09.2014 16:16, Sathish Kumar wrote:
So in the SiO2 structure atoms are getting merged; is that the reason for the
error in the simulation?
Aside from the y-PBC overlap, please look more closely at
your coordinates:
...
...
1882MAK Si 1710 6.303 0.003 4.160
...
...
...
4480SOB BO 6359
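The "distance checker" mentioned above could be re-implemented along these lines (a hypothetical sketch, not the original tool): parse the fixed-width .gro columns and flag atom pairs that sit closer than a threshold under the minimum-image convention in a rectangular box.

```python
import math

def parse_gro(text):
    """Parse .gro text: fixed-width columns, coordinates in nm."""
    lines = text.splitlines()
    natoms = int(lines[1])
    atoms = []
    for ln in lines[2:2 + natoms]:
        name = ln[10:15].strip()                       # atom name field
        x, y, z = (float(ln[20:28]), float(ln[28:36]), # x, y, z columns
                   float(ln[36:44]))
        atoms.append((name, (x, y, z)))
    box = tuple(float(v) for v in lines[2 + natoms].split()[:3])
    return atoms, box

def close_pairs(atoms, box, cutoff=0.05):
    """Return index pairs closer than `cutoff` nm across periodic images."""
    pairs = []
    for i in range(len(atoms)):
        for j in range(i + 1, len(atoms)):
            d2 = 0.0
            for a, b, length in zip(atoms[i][1], atoms[j][1], box):
                d = abs(a - b)
                d = min(d, length - d)   # minimum-image convention
                d2 += d * d
            if math.sqrt(d2) < cutoff:
                pairs.append((i, j))
    return pairs
```

On a large em.gro this O(N^2) loop is slow and a cell list would be the next step, but for spotting duplicated or PBC-overlapping SiO2 positions it is enough.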
I've run into a problem with an older card (GTX 580),
which is CC 2.0. On a larger box size, mdrun stops
with:
Fatal error:
Watch out, the input system is too large to simulate!
The number of nonbonded work units (=number of super-clusters)
exceeds the maximum grid size in x dimension
On 05.08.2014 07:01, Abhishek Acharya wrote:
I am planning on investing in a Beowulf cluster with 6 nodes (48 cores), each
with an AMD FX-8350 processor and 8 GB memory, connected by a 1-Gigabit Ethernet
switch. Although I plan to add more cores to this cluster later on, what is the
max performance
On 22.07.2014 00:51, John Doe wrote:
[ 8%] Building CXX object
src/gromacs/CMakeFiles/libgromacs.dir/gmxpreprocess/fflibutil.cpp.o
/home/User/gromacs/gromacs-5.0/src/gromacs/gmxpreprocess/fflibutil.cpp: In
function ‘int low_fflib_search_file_end(const char*, gmx_bool, const char*,
gmx_bool,
On 14.07.2014 10:21, Nizar wrote:
Can I force GROMACS 5.0 to use both CPU and GPU?
AFAIK, unfortunately not.
I simulate a protein using GROMACS 5.0 installed on an old MacBook Pro Mid-2009
with a 2.26 GHz Core 2 Duo processor and an NVIDIA GeForce 9400M supporting CUDA
technology. My GPU, however, has
On 14.07.2014 21:05, elham tazikeh wrote:
How can I run Perl script files on my Linux system?
This is not a Gromacs-related question. Questions
like these are best answered on appropriate
sites like stackoverflow.com.
Is this correct?
chmod +x *.pl
Yes, but most probably unnecessary.
perl
I installed 5.0 on both Linux (GPU) and Windows (GPU),
and it seems to work very well.
Some remarks:
On 29.06.2014 22:53, Mark Abraham wrote:
* The native GPU port available in GROMACS 4.6 supports a wider range of
simulation types, and now requires CUDA 4.0
Should I read this 4.6 as 5.0?
*
On 20.05.2014 08:13, Andrei Neamtu wrote:
We are currently in the process of buying several servers which will
include GPU accelerators.
Because of the price/performance balance given our limited funds, we
have to choose between K40 and K20X Tesla accelerators.
Initially, we wanted 2
On 13.05.2014 11:18, Albert wrote:
I compiled Gromacs on a GPU machine with two GTX 780 Ti cards using the following
command:
env CC=icc F77=ifort CXX=icpc
CMAKE_PREFIX_PATH=/soft/intel/mkl/include/fftw:/soft/intel/impi/
4.1.3.049/intel64:/soft/intel/mkl/lib/intel64 cmake ..
-DBUILD_SHARED_LIB=OFF
On 13.05.2014 12:00, Albert wrote:
Hello,
Thanks for the reply.
It is very strange; I cannot run this command as a normal user:
Failed to initialize NVML: Insufficient Permissions
Here is the output when I use sudo to run the command you mentioned:
Please look into the solutions posted here:
On 25.04.2014 00:07, Ricardo O. S. Soares wrote:
I ran an NVT equilibration of a 24-million-atom CG Martini system and detected
the formation of vacuum spots in the solvent.
CG Martini has many sphere types; did you solvate your system
with 24x10^6 MARTINI-W segments?
MARTINI-W is strongly
On 03.04.2014 16:10, GtrAngus wrote:
I have a question for all the people who are familiar with C code. How can I
read the total number of frames in my *.xtc file? Especially when I do
the analysis somewhere in the middle?
E.g.: I have a file with 100,000 frames. When I start the program with
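The XTC format does not store a frame count anywhere, so counting means walking the file frame by frame. A sketch in Python rather than C, assuming the standard compressed layout used when natoms > 9, where each frame starts with a 92-byte big-endian XDR header whose last field is the compressed-coordinate byte count:

```python
import struct

def count_xtc_frames(path):
    """Count frames in a GROMACS .xtc file by hopping over the per-frame
    headers. Assumes natoms > 9 (compressed-coordinate layout)."""
    nframes = 0
    with open(path, "rb") as f:
        while True:
            header = f.read(92)
            if len(header) < 92:
                break                         # end of file (or truncated frame)
            magic, = struct.unpack(">i", header[:4])
            if magic != 1995:                 # XTC frame magic number
                raise ValueError("not an XTC frame header")
            nbytes, = struct.unpack(">i", header[88:92])
            f.seek((nbytes + 3) // 4 * 4, 1)  # skip coord data, padded to 4 bytes
            nframes += 1
    return nframes
```

In C the same loop is usually written with libxdrfile's read_xtc(); and for analyses that start "somewhere in the middle", recording the byte offset of every Nth frame during this scan lets you seek there directly instead of re-reading from the start.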
On 02.04.2014 19:14, Amjad Farooq wrote:
I am running Gromacs 5.0-beta on a remote workstation (next door) from my
PC. The two machines are networked via 1-Gigabit PCI-E ethernet cards (on
each machine) and a CAT-6 ethernet cable.
My question is:
Would my MD simulations benefit in terms of speed
On 01.04.2014 10:36, Ly Minh Nhat wrote:
I don't know, I just followed the instructions. What information should I
give?
It's on the fftw3 install page
(http://www.fftw.org/doc/Installation-on-Unix.html)
in the first item:
• --enable-float: Produces a single-precision version of FFTW
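Put together, a single-precision FFTW build typically looks like this (a sketch; the prefix is an example, and the SSE flag is an assumption that fits x86 machines of that era):

```
./configure --enable-float --enable-sse2 --prefix=/usr/local
make
make install
```

Without --enable-float you get a double-precision library, which a single-precision GROMACS build will not find.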
On 01.04.2014 21:10, Ly Minh Nhat wrote:
There is no output for source /usr/local/gromacs/bin/GMXRC
ooker@ooker-Aspire-4741:~$ source /usr/local/gromacs/bin/GMXRC
ooker@ooker-Aspire-4741:~$
There is GMXRC file in /usr/local/gromacs/bin.
Yes.
And after this 'source command', what does
$
On 13.03.2014 13:25, Szilárd Páll wrote:
7) -DCMAKE_INSTALL_PREFIX=/cygdrive/c/Gromacs465 is the installation
directory; -DGMX_MPI=OFF.
With the second installation, steps 10) to 13), you will replace the
non-MPI mdrun.exe with one that can run on multiple cores. If you omit this
step, the generated
Hi Gupta,
Is there any problem with the server or with running on multiple threads?
Your system seems to be very small and is probably subject
to strong fluctuations in pressure and/or energy. Could
you please create two directories containing (only) the
- mdp file for input (your md.mdp)
- topology
On 20.02.2014 12:40, Archana Sonawani-Jagtap wrote:
I have done most of the simulations using this version. However, I had
to format my PC and now have CentOS 6. Does it matter a lot if I compare
simulations run using different versions of gromacs for publication?
There is always a non-zero
On 29.01.2014 11:15, Mark Abraham wrote:
* lots of internal cleaning of the house
* various bug fixes
Status report on Windows:
Builds perfectly!
Builds error-free out of the box using
Visual C++ 2012 (VC11), cmake 2.8.12.1,
CUDA 5.5, already installed Boost 1.55,
already installed
On 29.01.2014 15:06, Mirco Wahab wrote:
A test run (Martini small vesicle system,
250K atoms) started promisingly but crashed
after 15 min without any indication. (I'll
try without GPU later.)
I identified the problem. It is the same problem
that also required a patch in the 4.6.x versions
On 22.01.2014 15:03, Jianguo Li wrote:
I have a doubt about the GPU hardware. Recently I heard from an NVIDIA person
that, unlike Kepler cards, GTX cards do not have memory error checking and
correction. It is no problem for gaming, but it may lead to wrong results for
scientific computing.
Fermi,
On 20.01.2014 11:30, Carlos Familia wrote:
I was wondering if anybody has come across this error with Gromacs compiled
under 32-bit Cygwin on Windows 7.
...
tMPI error: I/O or system error: No such file or directory
procedure.txt: line 7: 5624 Aborted (core dumped) mdrun -v
Gromacs' mdrun won't
On 21.12.2013 00:17, Patrick Fuchs wrote:
Hi,
to follow up on this, the simulation with Reaction-Field-nec under
4.5.3 has completed. The final area is 0.60 nm^2 (see
http://redmine.gromacs.org/issues/1400 for the plot). Lutz, you
mentioned you tried a simulation with nstlist=1; could you give
On 16.12.2013 21:10, Lutz Maibaum wrote:
I am running MD simulations of DPPC bilayers with the Gromos force field, and I
am seeing some differences between using Gromacs 4.0.7 and 4.5.5/4.6.5 that I
do not understand. It would be great if someone had any insight into what's
going on here.
On 13.12.2013 18:01, chem grad wrote:
Thank you so much!
Gromacs seems to be installed properly now.
but mdrun doesn't work, right?
M.
--
Gromacs Users mailing list
* Please search the archive at
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!
* Can't post?
On 13.12.2013 21:38, chem grad wrote:
I cannot convince Cygwin to source the Gromacs files properly (well, at least I
think that's the problem). When I enter luck into the command line, it returns
a "luck: command not found" error message.
Also, both commands I have entered (pdb2gmx and mdrun)