Intel nodes with 1 or 2 GPUs will give you the best performance and the best
performance-to-price ratio.
Best,
Carsten
Thanks again.
Best,
D
2015-01-16 14:46 GMT+01:00 Carsten Kutzner ckut...@gwdg.de:
Hi David,
On 16 Jan 2015, at 12:28, David McGiven davidmcgiv...@gmail.com wrote:
Hi
--
Dr. Carsten Kutzner
Max Planck Institute for Biophysical Chemistry
Theoretical and Computational Biophysics
Am Fassberg 11, 37077 Goettingen, Germany
Tel. +49-551-2012313, Fax: +49-551-2012302
http://www.mpibpc.mpg.de
* Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!
* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a
mail to gmx-users-requ...@gromacs.org.
Hi Carsten,
Thanks for your answer.
2015-01-16 11:11 GMT+01:00 Carsten Kutzner ckut...@gwdg.de:
Hi David,
we are just finishing an evaluation to find out which hardware is
optimal for GROMACS
/Mailing_Lists
* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a
mail to gmx-users-requ...@gromacs.org.
--
Dr. Carsten Kutzner
Max Planck Institute for Biophysical Chemistry
Theoretical and Computational Biophysics
Am Fassberg 11
] On behalf of
Carsten Kutzner
Sent: Thursday, 8 January 2015 15:51
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users] g_tune_pme_mpi on GPU cluster fails
On 08 Jan 2015, at 15:32, Ebert Maximilian m.eb...@umontreal.ca wrote:
Hi Carsten,
I was benchmarking my first system and I
-boun...@maillist.sys.kth.se] On behalf of
Carsten Kutzner
Sent: Thursday, 8 January 2015 15:48
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users] Performance difference between MPI ranks and OpenMP
On 08 Jan 2015, at 15:38, Ebert Maximilian m.eb...@umontreal.ca wrote:
Hi list
---
Error on rank 1, will try to stop all ranks
Halting parallel program gmx_mpi on CPU 1 out of 4
-Original Message-
From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se
[mailto:gromacs.org_gmx-users-boun...@maillist.sys.kth.se] On behalf of
Carsten Kutzner
be iso, iso-pf, pm, pm-pf, rm, rm-pf, rm2,
rm2-pf, flex, flex-t, flex2, flex2-t
rot_type0= flex
Use flex-t instead, this should do the trick.
What works with pulling: iso-pf, pm-pf, rm-pf, rm2-pf, flex-t, flex2-t
Happy pulling!
Carsten
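For reference, the corresponding .mdp setting would be a one-line change (a minimal sketch; the parameter name `rot_type0` and the list of pulling-compatible values are taken from the messages above):

```
; pick one of the rotation types that work with pulling:
; iso-pf, pm-pf, rm-pf, rm2-pf, flex-t, flex2-t
rot_type0 = flex-t
```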
From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se
[gromacs.org_gmx-users-boun...@maillist.sys.kth.se] on behalf of Carsten
Kutzner [ckut...@gwdg.de]
Sent: Wednesday, November 19, 2014 11:43 AM
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users] Installing
It would be much appreciated if anyone could provide information about how
reliable the 980s are.
Best,
Jane
On Tue, Nov 4, 2014 at 1:36 PM, Carsten Kutzner ckut...@gwdg.de wrote:
Hi,
On 04 Nov 2014, at 22:33, Jian Yin janeyin...@gmail.com wrote:
Hi there!
I just wonder what
Of
Carsten Kutzner
Sent: Montag, 29. September 2014 19:23
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users] g_tune_pme_mpi is not compatible to mdrun_mpi
On 29 Sep 2014, at 18:40, Mark Abraham mark.j.abra...@gmail.com wrote:
Hi,
That seems suitable.
Oh, it just occurred to me
---
-Original Message-
From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se
[mailto:gromacs.org_gmx-users-boun...@maillist.sys.kth.se] On Behalf Of
Carsten Kutzner
Sent: Thursday, 25 September 2014 19:29
To: gmx-us...@gromacs.org
Subject: Re: [gmx-users
Rudi van Drunen, Anton Feenstra, Sebastian Fritsch,
Gerrit Groenhof, Christoph Junghans, Peter Kasson, Carsten Kutzner,
Per Larsson, Justin A. Lemkul, Magnus Lundborg, Pieter Meulenhoff,
Erik Marklund, Teemu Murtola, Szilard Pall, Sander Pronk,
Roland Schulz
anyway.
So even if you say
g_tune_pme -np 48 -s input.tpr
we first check with
mpirun -np 2 mdrun -s input.tpr
and only after that continue with -np 48.
Carsten
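The two-stage launch described above can be sketched as a small shell wrapper. This is our own illustration, not g_tune_pme's actual code: the function name `run_tune`, the `-nsteps 50` value, and the exact mdrun flags are assumptions.

```shell
# Hypothetical sketch of the flow: a cheap 2-rank sanity run first,
# so a broken MPI setup fails fast, then the full-size run.
run_tune() {
  np=$1
  tpr=$2
  # sanity check with 2 ranks (step count is an assumed small value)
  mpirun -np 2 mdrun -s "$tpr" -nsteps 50 || return 1
  # only after that continue with the requested rank count
  mpirun -np "$np" mdrun -s "$tpr"
}
```

Making the first stage optional, as suggested below, would amount to skipping the 2-rank run when a flag is set.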
We should just make the check optional, instead of being a deal
breaker.
Mark
On Sep 29, 2014 4:35 PM, Carsten Kutzner ckut
for this case I guess we also need to make the test optional.
Carsten
Mark
On Mon, Sep 29, 2014 at 6:32 PM, Carsten Kutzner ckut...@gwdg.de wrote:
Hi,
On 29 Sep 2014, at 18:17, Mark Abraham mark.j.abra...@gmail.com wrote:
Hi,
It can't be fixed, because there is no surefire way
/stupid?
Cheers,
Oliver
On 08/29/2014 05:15 PM, Carsten Kutzner wrote:
Hi Dawei,
On 29 Aug 2014, at 16:52, Da-Wei Li lida...@gmail.com wrote:
Dear Carsten
Thanks for the clarification. Here it is my benchmark for a small protein
system (18k atoms).
(1) 1 node (12 cores/node, no GPU
nodes will result in 33 ns/day, that is, about 3
times slower than an MD run on one node (2 GPUs + 12 cores).
I have no idea what is wrong.
dawei
On Mon, Sep 1, 2014 at 5:34 AM, Carsten Kutzner ckut...@gwdg.de wrote:
Hi,
take a look at mdrun’s hidden but sometimes useful options:
mdrun
, and the last 5 DD ranks to GPU id 1.
Carsten
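A mapping like that (first half of the DD ranks on GPU id 0, last half on GPU id 1) is expressed through mdrun's -gpu_id string, one digit per PP rank. The helper below that builds the string is our own illustration, not part of GROMACS:

```shell
# Build a -gpu_id digit string: one character per PP rank,
# first half '0' (GPU 0), second half '1' (GPU 1).
gpu_id_string() {
  nranks=$1
  half=$(( nranks / 2 ))
  printf '0%.0s' $(seq 1 "$half")
  printf '1%.0s' $(seq 1 $(( nranks - half )))
  echo
}
gpu_id_string 10   # prints 0000011111
```

So something like `mpirun -np 10 mdrun_mpi -gpu_id $(gpu_id_string 10)` would put the first 5 DD ranks on GPU id 0 and the last 5 on GPU id 1.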
dawei
On Mon, Sep 1, 2014 at 8:39 AM, Carsten Kutzner ckut...@gwdg.de wrote:
Hi Dawei,
on two nodes, regarding the cases with and without GPUs,
do you use the same domain decomposition in both cases?
Carsten
On 01 Sep 2014, at 14:30
Hi Dawei,
the mapping of GPUs to PP ranks is printed for the Master node only,
but if this node reports two GPUs, then all other PP ranks will also
use two GPUs (or an error is reported).
The scaling will also depend on your system size; if this is too small,
then you might be better off by
On 22 Aug 2014, at 12:48, xiexiao...@sjtu.edu.cn wrote:
Does anyone know that what the pme ranks mean?
See for example [1] in the section
Multiple-Program, Multiple-Data PME Parallelization.
Best,
Carsten
1. Hess, B., Kutzner, C., van der Spoel, D., Lindahl, E. GROMACS 4: Algorithms
for highly efficient, load-balanced, and scalable molecular simulation.
J. Chem. Theory Comput. 4(3), 435-447 (2008).
, Christoph Junghans,
Peter Kasson, Carsten Kutzner, Per Larsson, Pieter Meulenhoff,
Teemu Murtola, Szilard Pall, Sander Pronk, Roland Schulz,
Michael Shirts, Alfons Sijbers, Peter Tieleman,
Berk Hess, David van der Spoel, and Erik Lindahl
On 02 Jul 2014, at 12:55, ABEL Stephane 175950 stephane.a...@cea.fr wrote:
Hello,
in short: is it possible? I am using GROMACS v4.6.5.
No.
Carsten
Thanks
Stéphane
Hi,
technically, you can use PME with charged systems in Gromacs. However,
if the charges in your system are distributed inhomogeneously (as, e.g.
in a water/membrane system), you will get artefacts, as described in
http://www.mpibpc.mpg.de/14063977/Hub_2014_JCTC.pdf
Best,
Carsten
On 23 Apr
the bug report - http://redmine.gromacs.org/issues/1460 - and
assigned it to you. I also set the priority to low, don't know if it
matters.
Cheers,
João
2014-03-13 18:04 GMT+01:00 Carsten Kutzner ckut...@gwdg.de:
Dear João,
On 13 Mar 2014, at 14:38, João Rodrigues anar...@gmail.com
Thanks for reporting this!
Also, why is my rlist changing since it is equal to rcoulomb? It should be
kept the same (line 1004) right?
This is a feature of the Verlet scheme, see
http://www.gromacs.org/Documentation/Cut-off_schemes
Best,
Carsten
2014-03-13 18:04 GMT+01:00 Carsten Kutzner ckut...@gwdg.de:
Dear João,
On 13 Mar 2014, at 14:38, João Rodrigues anar...@gmail.com wrote:
Hi all,
I've been playing with g_tune_pme (neat
On 04 Mar 2014, at 13:08, Alexander Björling alex.bjorl...@gmail.com wrote:
Dear users,
I'm trying to run simulations on a cluster of nodes, each sporting two AMD
Opteron 8-core CPUs. I would like to have one MPI process on each CPU,
with OpenMP threads on the 8 cores of each. I've
Hi Mousumi,
from the fact that you get lots of backup files directly at the beginning
I suspect that your mdrun is not MPI-enabled. This behavior is exactly what
one would get when launching a number of serial mdruns on the same input file.
Maybe you need to look for an mdrun_mpi executable.
Hi,
start with using as many MPI processes as you have GPUs. GROMACS
will use several OpenMP threads per MPI process to use all your CPU
cores.
You can also do that manually with
mpirun -np 2 mdrun-mpi -ntomp 6
Carsten
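The numbers in that command line follow from simple division: on the 12-core, 2-GPU nodes discussed in this thread, one MPI rank per GPU leaves 6 OpenMP threads per rank. A trivial helper (ours, not a GROMACS tool) makes the arithmetic explicit:

```shell
# threads per MPI rank = cores per node / GPUs per node
# (assumes the cores divide evenly among the ranks)
threads_per_rank() {
  cores=$1
  ngpus=$2
  echo $(( cores / ngpus ))
}
threads_per_rank 12 2   # prints 6, matching "mpirun -np 2 mdrun-mpi -ntomp 6"
```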
On 12/10/2013 10:30 AM, rajat desikan wrote:
Dear all,
I recently