Re: [gmx-users] GROMACS performance issues on POWER9/V100 node

2020-04-23 Thread Kevin Boyd
…at NIST is having the same or similar problems with > POWER9/V100. > > Jon

Re: [gmx-users] GROMACS performance issues on POWER9/V100 node

2020-04-23 Thread Kevin Boyd
Hi, Can you post the full log for the Intel system? I typically find the real cycle and time accounting section a better place to start debugging performance issues. A couple quick notes, but need a side-by-side comparison for more useful analysis, and these points may apply to both systems so

Re: [gmx-users] Question about Mean Square Displacement (MSD)

2020-04-19 Thread Kevin Boyd
…suggestions, but the problem is the difference > between my answer and the MSD GROMACS calculates. I performed a 6 ns > simulation just to check my MSD results, and I'm not going to calculate > the diffusion coefficient from it. > > On Sun, 19 Apr 2020 at 02:33, Kevin Boyd wrote

Re: [gmx-users] Question about Mean Square Displacement (MSD)

2020-04-18 Thread Kevin Boyd
I saved positions every 10 ps for a 6000 > ps simulation. Should I lower this, or is there another way to use more > trajectories? > > On Sun, 19 Apr 2020 at 00:10, Kevin Boyd wrote: > > > Hi, > > > > Are you talking about the reported diffusion coefficient or th

Re: [gmx-users] Question about Mean Square Displacement (MSD)

2020-04-18 Thread Kevin Boyd
Hi, Are you talking about the reported diffusion coefficient or the MSD vs lag plot? You should be very careful about where you fit. By default, Gromacs calculates MSDs at much longer lag times than you typically have good data for. Use the -beginfit and -endfit options to restrict the fit to the
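As a sketch (file names and the fit window are illustrative, assuming something like the 6 ns trajectories discussed in this thread):

    gmx msd -f traj.xtc -s topol.tpr -n index.ndx -beginfit 1000 -endfit 4000 -o msd.xvg   # fit only the 1-4 ns lag window (times in ps)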

Re: [gmx-users] Spec'ing for new machines (again!)

2020-04-17 Thread Kevin Boyd
…whether there'd be any > additional speed boost if we also used AMD GPUs. > > Nope, haven't seen the paper, but quite interested in checking it out. > Is this the latest version? > https://onlinelibrary.wiley.com/doi/abs/10.1002/jcc.26011 > > Thank you, > > Alex

Re: [gmx-users] Spec'ing for new machines (again!)

2020-04-17 Thread Kevin Boyd
Hi, AMD CPUs work fine with Nvidia GPUs, so feel free to use AMD as a base regardless of the GPUs you end up choosing. In my experience AMD CPUs have had great value. A ratio of ~4 cores/ GPU shouldn't be a problem. 256 GB of RAM is very much overkill, but perhaps you have other uses for the

Re: [gmx-users] Unable to compile GROMACS 2020.1 using GNU 7.5.0

2020-04-04 Thread Kevin Boyd
Hi, I've had problems in the past with syntax requirements for CMAKE_PREFIX_PATH. Try putting the path in quotes and separating with a semicolon instead of a colon. Kevin On Sat, Apr 4, 2020 at 1:40 PM Wei-Tse Hsu wrote: > Dear gmx users, >
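A sketch of the suggested syntax (paths are placeholders):

    cmake .. -DCMAKE_PREFIX_PATH="/opt/fftw3;/opt/openmpi"   # quotes keep the shell from splitting on the semicolon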

Re: [gmx-users] Optimising mdrun

2020-04-01 Thread Kevin Boyd
Hi - > This setting is using 16 MPI processes and 2 OpenMP threads per MPI process With ntomp 1 you should only be getting one OpenMP thread, not sure why that's not working. Can you post a link to a log file? For a small system like that and a powerful GPU, you're likely going to have some
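For comparison, a minimal sketch of an explicit one-thread-per-rank layout (counts are illustrative):

    gmx mdrun -deffnm md -ntmpi 16 -ntomp 1 -pin on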

Re: [gmx-users] About membrane leaflets and msd

2020-03-18 Thread Kevin Boyd
Hi, Yes, that's a reasonable approach. Check out gmx select, if say you know your center of membrane's z location, you can select for phosphates > that center point, which will give you your top leaflet. Kevin On Tue, Mar 17, 2020 at 12:49 PM Poncho Arvayo Zatarain < poncho_8...@hotmail.com>
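A sketch of that selection, assuming phosphorus atoms named P and a membrane center near z = 5 nm (both placeholders):

    gmx select -f traj.xtc -s topol.tpr -select 'name P and z > 5' -on upper_leaflet.ndx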

Re: [gmx-users] GPU considerations for GROMACS

2020-02-19 Thread Kevin Boyd
Hi, A CPU:GPU ratio of 4:1 is fairly well balanced these days (depending on the quality of the hardware), so you should expect to roughly double your throughput adding a second GPU to your current system. However, that doesn't mean your single simulation performance will double - it's a lot more

Re: [gmx-users] LJ interactions in gromacs

2020-02-16 Thread Kevin Boyd
Hi, A few groups have done things like this to shape membranes. See https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5700167/ and https://pubs.acs.org/doi/10.1021/acs.jctc.8b00765 I was involved in the second publication, so feel free to contact me about implementation details if they're unclear.

Re: [gmx-users] Problem with mpirun

2020-02-08 Thread Kevin Boyd
Hi, Can you send us the output of gmx_mpi --version? I typically see illegal instructions when I compile gromacs on one architecture but accidentally try to run it on another. Kevin On Thu, Feb 6, 2020 at 6:38 AM Seketoulie Keretsu wrote: > Dear Sir/Madam, > > We just installed gromacs 2019
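For example, the SIMD level the binary was built for can be compared against the host CPU (assuming the binary runs far enough to print its version banner):

    gmx_mpi --version | grep -i 'SIMD instructions'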

Re: [gmx-users] Bilayer exploding with semiisotropic coupling

2020-01-04 Thread Kevin Boyd
Hi, Can you share more information? Please upload your starting configuration and a log file. On Fri, Jan 3, 2020 at 10:29 AM Namit Chaudhary wrote: > Hi, > > Sorry. I didn't realize that attachments aren't uploaded. Below is a link > for the files mentioned in the original mail. > >

Re: [gmx-users] Gromacs 2019 - Ryzen Architecture

2020-01-04 Thread Kevin Boyd
Hi, A few things besides any Ryzen-specific issues. First, your pinoffset for the second one should be 16, not 17. The way yours is set up, you're running on cores 0-15, then Gromacs will detect that your second simulation parameters are invalid (because from cores 17-32, core 32 does not exist)
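A sketch of a corrected layout for two 16-thread runs on a 32-core node (offsets illustrative):

    gmx mdrun -deffnm sim1 -ntomp 16 -pin on -pinoffset 0 &
    gmx mdrun -deffnm sim2 -ntomp 16 -pin on -pinoffset 16 &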

Re: [gmx-users] Maximising Hardware Performance on Local node: Optimal settings

2019-12-07 Thread Kevin Boyd
Hi, I also wrote up some examples on optimizing for multiple simulations on the same node, see https://mailman-1.sys.kth.se/pipermail/gromacs.org_gmx-users/2019-July/126007.html On Wed, Dec 4, 2019 at 9:36 AM Christian Blau wrote: > Hi Matt, > > > Here are a few bullet points that might help

Re: [gmx-users] emtol criterion

2019-11-27 Thread Kevin Boyd
> For GROMACS, I think the emtol > value should be reported, but this varies by personal preference in most > papers, unfortunately. I had thought that emtol had no particular significance for minimizations used to set up typical MD simulations, as long as the system was sufficiently minimized as

Re: [gmx-users] Building different size membranes using gromacs 5.1.4

2019-08-08 Thread Kevin Boyd
Note that the solution Dallas suggested will work (along with changing the resulting box dimensions), but that it may lead to clashes at periodic boundaries. You may need to re-minimize (perhaps with soft-core potentials if there are serious clashes) and re-equilibrate, which would probably defeat

Re: [gmx-users] simulation on 2 gpus

2019-07-26 Thread Kevin Boyd
…dedicate a > subset of CPU/GPU to one run, and start another run later using another > subset of yet-unallocated CPU/GPU). Also, could you elaborate on the > drawbacks of the MPI compilation that you hinted at? > Gregory

Re: [gmx-users] simulation on 2 gpus

2019-07-25 Thread Kevin Boyd
Hi, I've done a lot of research/experimentation on this, so I can maybe get you started - if anyone has any questions about the essay to follow, feel free to email me personally, and I'll link it to the email thread if it ends up being pertinent. First, there's some more internet resources to

Re: [gmx-users] SMD Pull Direction

2019-06-24 Thread Kevin Boyd
Hi, You can accomplish that using pull-coord1-geometry=direction and an appropriate vector (0 1 0 for the positive y axis, etc). Kevin On Mon, Jun 24, 2019 at 3:15 PM Reza Esmaeeli wrote: > Dear Gromacs Users, > Is there any way to specify the direction of the pull along a certain axis > in
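The relevant mdp lines would look something like this (other pull settings omitted):

    pull-coord1-geometry = direction
    pull-coord1-vec      = 0 1 0    ; pull along the positive y axis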

Re: [gmx-users] Timestep in GROMACS simulation

2019-05-28 Thread Kevin Boyd
Hi, That totally depends on your forcefield. Most classical all-atom forcefields are recommended to be run with a 2 femtosecond timestep. Some coarse-grained forcefields like Martini can do 20+ fs, but you’ll want to check the literature. As for better sampling, you simply sample longer times
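In mdp terms, a typical all-atom setup is a sketch like (values illustrative):

    dt     = 0.002      ; ps, i.e. a 2 fs timestep
    nsteps = 50000000   ; 100 ns total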

Re: [gmx-users] non-water solvent + gmx solvate

2019-05-27 Thread Kevin Boyd
…(link mangled by the mail system), > > but Szilard assured me that it wasn't much of an issue. Indeed, the > build worked fine with our "usual" simulations. This one experiencing > issues (minor with 2018 and catastrophic with 2019) is new and this > setup isn't expected…

Re: [gmx-users] non-water solvent + gmx solvate

2019-05-27 Thread Kevin Boyd
…with insert-molecules. > > Alex > > On 5/26/2019 8:31 AM, Kevin Boyd wrote: > > Hi, > > Which version are you using? As of 2019 gmx solvate should support > nonwater solvents and topology updating. > > If it’s not working with 2019, can you open up an…

Re: [gmx-users] non-water solvent + gmx solvate

2019-05-26 Thread Kevin Boyd
Hi, Which version are you using? As of 2019 gmx solvate should support nonwater solvents and topology updating. If it’s not working with 2019, can you open up an issue on redmine.gromacs.org and upload the files you used? I can take a look. Thanks, Kevin > On May 26, 2019, at 9:44 AM, Jones de

Re: [gmx-users] Gromacs 2019.2 on Power9 + Volta GPUs (building and running)

2019-05-01 Thread Kevin Boyd
…Kevin On Wed, May 1, 2019 at 6:33 PM Alex wrote: > Of course, I am not. This is the EM. ;) > > On Wed, May 1, 2019, 4:30 PM Kevin Boyd wrote: > > > Hi, > > > > In addition to what Mark said (and I've also found pinning to be critical > > for performance), you're also…

Re: [gmx-users] Gromacs 2019.2 on Power9 + Volta GPUs (building and running)

2019-05-01 Thread Kevin Boyd
Hi, In addition to what Mark said (and I've also found pinning to be critical for performance), you're also not using the GPUs with "-pme cpu -nb cpu". Kevin On Wed, May 1, 2019 at 5:56 PM Alex wrote: > Well, my experience so far has been with the EM, because the rest of the > script (with
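To actually put work on the GPUs, the flags would instead look something like this sketch:

    gmx mdrun -deffnm md -nb gpu -pme gpu -pin on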

Re: [gmx-users] Simulation is very slow

2019-03-14 Thread Kevin Boyd
Hi, We can't help without more information. Have you checked the log file to make sure the GPUs are being seen/used? Can you post a link to a sample log file? Kevin On Thu, Mar 14, 2019 at 11:57 AM 이영규 wrote: > Dear gromacs users, > > I installed gromacs 2019 today. When I run gromacs, it is

Re: [gmx-users] Simulation crashed - Large VCM, Pressure scaling more than 1%, Bond length not finite

2019-02-27 Thread Kevin Boyd
Hi, If it was something fundamentally wrong, you'd see an issue before this. Martini is just inherently a teensy bit unstable - but this is where the non-reproducibility of simulations comes in handy; restarting from far enough away will likely avoid a transiently high energy event. Kevin On

Re: [gmx-users] offloading PME to GPUs

2019-02-06 Thread Kevin Boyd
Hi, Your log file will definitely tell you whether PME was offloaded. The performance gains depend on your hardware, particularly the CPU/GPU balance. There have been a number of threads on this forum discussing this topic, if you search back through the gmx_user archives. The gist of it is that

Re: [gmx-users] Equilibration step not moving forward

2019-02-05 Thread Kevin Boyd
Hi, We can't help you unless you're more specific. What error is occurring? Kevin On Tue, Feb 5, 2019 at 1:30 AM Deepanshi wrote: > Hello, > I am trying to equilibrate a bilayer vesicle which I have prepared using > martini maker of charmm-GUI. The vesicle is made up of POPC and has around >

Re: [gmx-users] why .top file is not updated with added waters in "gmx solvate" if customised water.gro is provided?

2019-01-20 Thread Kevin Boyd
Hi, The hard-coded SOL reference that Justin mentioned has been fixed in Gromacs 2018.3 and 2019. If you upgrade your gromacs version, gmx solvate should work as intended. Kevin > On Jan 20, 2019, at 3:11 PM, Justin Lemkul wrote: > > > >> On 1/20/19 3:04 PM, ZHANG Cheng wrote: >> In the
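With a fixed version, passing -p updates the topology in place (file names are placeholders):

    gmx solvate -cp conf.gro -cs my_solvent.gro -p topol.top -o solvated.gro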

Re: [gmx-users] How to use "define = -DPOSRES" in Gromacs 2018?

2019-01-14 Thread Kevin Boyd
To add to Mark's comments, it's commonly the case that you want to apply restraints based on the starting configuration, e.g. for restraining protein positions at the beginning of a run. As the warning message says, you can pass the same file to both -c and -r for these cases. Also, you're
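A sketch of that invocation (file names are placeholders; md.mdp would contain define = -DPOSRES):

    gmx grompp -f md.mdp -c start.gro -r start.gro -p topol.top -o md.tpr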

Re: [gmx-users] Running simulation differences

2018-12-17 Thread Kevin Boyd
Hi, First, with those associated errors I wouldn't say that those differences are significant. More to your question, Gromacs simulations with default parameters are not generally reproducible. See the last 2 points in this section of the reference manual:

Re: [gmx-users] Mean square displacement on Log-Log plot?

2018-12-10 Thread Kevin Boyd
Hi, If you're reporting a diffusion coefficient, they're probably looking for you to justify that you're out of the short-time subdiffusive regime. My experience is in bilayer simulations, where the MSD hits that regime typically in the time lag range of ~10-20 ns. For a qualitative estimate of

Re: [gmx-users] pcoupltype

2018-10-30 Thread Kevin Boyd
Hi, For membrane systems you typically want to use semi-isotropic pressure coupling. If instead you want to simulate *one* lipid (as a ligand) with a protein in solution, you should stick to isotropic pressure coupling. I've never heard of any anisotropic pressure coupling protocols in

Re: [gmx-users] gromacs+CHARMM36+GPUs induces protein-membrane system collapse?

2018-10-30 Thread Kevin Boyd
Hi, Can you send an edr and log file? I believe your constraints are nonstandard, iirc for charmm36 constraints should just be h-bonds, but I doubt that would cause this. Kevin On Tue, Oct 30, 2018 at 2:50 PM, Ramon Guixà wrote: > Hi Mark, thanks for the quick response > > No transition

Re: [gmx-users] Position Restraint File for grompp

2018-10-30 Thread Kevin Boyd
Hi, Depends on what kind of restraint you are trying to apply. If you're just trying to restrain head groups during the initial equilibration period, you can feed -r your actual structure, since the restraints should be relative to current positions. If instead you're following the pore-forming

Re: [gmx-users] How do i reopen a redmine ticket?

2018-10-19 Thread Kevin Boyd
Hi, I think everyone should have edit permissions on redmine issues. I just checked that I could edit the same post. Sure you were logged in? Anecdotally, I once had the same issue - it ended up being because I had logged in on one tab with a redmine page open but was trying to edit a different

Re: [gmx-users] Pressure variation in the z direction : Gromacs LS / gromacs 5

2018-10-19 Thread Kevin Boyd
Hi, You need to regenerate your tpr with gromacs-ls. The trr format is stable between versions: you’d only need a new trajectory if you didn’t save velocities. Kevin > On Oct 19, 2018, at 10:37 AM, Candy Deck wrote: > > Dear Gromacs Users, > > I am actually working with gromacs 5. > I

Re: [gmx-users] How to fix the box size during the simulation?

2018-10-18 Thread Kevin Boyd
Hi, If your membrane is curved, do you mean to ask how you can enforce membrane *shape* rather than size? If so, fixing the box dimensions may not help maintain shape, depending on the curvature morphology involved. Kevin On Thu, Oct 18, 2018 at 5:20 AM lorenaz wrote: > Hi all, > > I am simulating

Re: [gmx-users] membrane pre-equilibration

2018-10-08 Thread Kevin Boyd
Hi, I'm not exactly sure what you're asking. In both cases (with protein or without protein), the charmm-gui provides you files which need to be minimized and equilibrated. Typically, the steps recommended in the README file are sufficient. The only requirement of these steps is to make your

Re: [gmx-users] membrane pre-equilibration

2018-10-07 Thread Kevin Boyd
Hi, You should just use charmm-gui's built in functionality to insert the protein, unless you have a good reason not to. Kevin. On Sun, Oct 7, 2018 at 7:23 AM Olga Press wrote: > Dear all, > I'm new in the field of simulating membrane-protein system in gromacs by > using charmm36-ff. > I've

Re: [gmx-users] GPU ERROR RUNING A SIMULATION

2018-09-26 Thread Kevin Boyd
Hi, We don't currently support energy groups on GPUs with the Verlet cutoff scheme - see the table linked below. http://manual.gromacs.org/documentation/2018/user-guide/cutoff-schemes.html To enable the simulation to run on GPUs, remove the energy groups line (and energy group exclusions, if
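That is, remove mdp lines like the following (group names are illustrative); per-group energies can still be computed afterwards on the CPU with gmx mdrun -rerun:

    energygrps = Protein Lipids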

Re: [gmx-users] Lateral pressure

2018-09-21 Thread Kevin Boyd
Hi, There's a hacked version of Gromacs 4.5.5 that can calculate lateral pressure profiles in a rerun. See: https://mdstress.org/index.php/gromacs-ls/ Keep in mind that you'll need positions AND velocities saved to do the analysis properly, and read their documentation carefully. Kevin On

Re: [gmx-users] Regarding index .ndx files

2018-09-17 Thread Kevin Boyd
Hi! There are two key points here. 1) Index groups don't have to be mutually exclusive. So, you can have your 20+ indices, but also have an inclusive index that encompasses more or all of your atoms, which you can then use for tc (or comm removal, etc.). The only necessary thing is that the
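For example, the index file can hold 20+ analysis groups while the mdp couples one inclusive group (values illustrative):

    tc-grps = System
    tau-t   = 1.0
    ref-t   = 310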

Re: [gmx-users] Density of POPC membrane and lateral diffusion

2018-08-21 Thread Kevin Boyd
Hi, In general, I'd say that 20 ns is far too short for a membrane simulation, but how long of a simulation is needed depends on what you're trying to calculate - lipid tail dynamics are quite fast, but head group dynamics are significantly slower. For some membrane-protein interactions, slow

Re: [gmx-users] how to use backup files as an input file?

2018-08-19 Thread Kevin Boyd
Hi, The backup files are the exact same files you originally had, just renamed - they can be analyzed like any other gromacs files. I’d suggest renaming them again (for instance, back to their original names) to avoid any confusion, but the contents of e.g. #md300.trr.1# are identical to what
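Note that the # characters need quoting in a shell, e.g.:

    mv '#md300.trr.1#' md300.trr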

Re: [gmx-users] thread-MPI compatibility requirement?

2018-08-15 Thread Kevin Boyd
Hi, This isn't a problem. Thread-mpi is the built-in mpi parallelization packaged with gromacs. What that message is saying is that Gromacs will use the openmpi library on your system instead, which is what you want when running on multiple nodes. Kevin On Wed, Aug 15, 2018 at 1:50 PM, Jost,

Re: [gmx-users] Using multiple GPU's

2018-08-10 Thread Kevin Boyd
Hi, Can you post a link to your log file? Also, what version of gromacs are you using? Make sure that the documentation you are following corresponds to the right version of gromacs. If you are using v 5.1 (as the link suggests), strongly consider upgrading to 2018. Kevin On Fri, Aug 10, 2018

Re: [gmx-users] Question about water insertion

2018-08-03 Thread Kevin Boyd
Hi, You can play around with the -radius and -scale parameters if you're getting clashes you don't like. However, it seems like you really should be using gmx solvate. You could accomplish your goal with "gmx solvate -cs tip4p.gro -box 10.1103 10.34753 3.958 -maxsol 13853" Kevin On Fri,

Re: [gmx-users] Compilation issue, MacOS 10.13.5 - Gromacs 2018 - CUDA 9.2

2018-06-18 Thread Kevin Boyd
Hi, In general you're not supposed to mix C compilers. I've had linking errors in the past, e.g. when using different versions of GCC between -DCMAKE_C_COMPILER and -DCUDA_HOST_COMPILER. See this post for a discussion.

Re: [gmx-users] Only 2 ns/day

2018-06-18 Thread Kevin Boyd
Hi, One source of poor performance is certainly that you don't have SIMD enabled. Try recompiling with SIMD enabled (the log file suggests AVX_128_FMA). If you are compiling gromacs on the same node architecture that you plan to run gromacs on (and you really should be doing this), it should

Re: [gmx-users] Fibril not restrained?

2018-06-07 Thread Kevin Boyd
Hi, To apply the restraints in the topology you need to use the define field in the mdp file. For your case, the option would be "define = -DPOSRES_A". Otherwise that #ifdef statement will evaluate to false and the restraints won't be included. Also, are there end quotes around
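The matching pieces would look something like this sketch (the itp file name is a placeholder):

    ; in topol.top
    #ifdef POSRES_A
    #include "posre_A.itp"
    #endif

    ; in the mdp file
    define = -DPOSRES_A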

Re: [gmx-users] Coarse Grained (Empty space In water)

2018-06-04 Thread Kevin Boyd
Hi, Why would you want to "ruin" a perfectly good nanoparticle? :) Could this be a visualization artifact? What kind of treatment of periodic boundary conditions are you applying prior to visualization? If you use a PBC option that makes your nanoparticle contiguous and not split across

Re: [gmx-users] Problem in installing GROMACS 2018

2018-05-26 Thread Kevin Boyd
Hi, Did you have a previous install of gromacs 5.1.2? If so, it’s potentially a case of you having two installations of gromacs, and the first one found by your os when you try to run gmx [command] is 5.1. Kevin > On May 27, 2018, at 12:10 AM, Ali Ahmed wrote: > > Dear
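A quick way to check for shadowed installations (standard shell tools):

    which -a gmx
    gmx --version | head -n 5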

Re: [gmx-users] Simulation crashed, fatal error: Bond length not finite and warning: Pressure scaling more than 1%.

2018-05-25 Thread Kevin Boyd
> Your equilibrations are probably too short. There are some pretty slow > processes in lipid membranes. The original poster stated that the system crashed after microseconds of simulation, so this is not the case. The pressure fluctuation message could be a red herring, with a system explosion

Re: [gmx-users] Simulating protein-lipid interactions

2018-05-23 Thread Kevin Boyd
CHARMM36m is the most recent release of the CHARMM forcefield, with improved protein dynamics. You can find the force field files here: http://mackerell.umaryland.edu/charmm_ff.shtml#gromacs That webpage also has a list of relevant publications for you to look over. Alternatively, the

Re: [gmx-users] gromacs 2018 make error with multiple compilers and GPU support

2018-05-08 Thread Kevin Boyd
On Tue, May 8, 2018 at 10:01 AM, Mark Abraham <mark.j.abra...@gmail.com> wrote: > On Tue, May 8, 2018 at 3:54 PM Kevin Boyd <kevin.b...@uconn.edu> wrote: > >> Thanks for the reply. >> >> The distro is actually relatively up to date, from what I can tell gcc…

[gmx-users] gromacs 2018 make error with multiple compilers and GPU support

2018-05-07 Thread Kevin Boyd
Hi, I've been trying to install gromacs 2018 on a cluster running Centos7. In keeping with the guidelines for maximizing performance, I'm compiling with a recent (7.3.0) GCC version. However, Cuda 9.0 on Centos 7.x needs to be compiled with GCC 4.8.5, so my cmake command included
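The setup described combines flags along these lines (paths are placeholders; CUDA_HOST_COMPILER points nvcc at the older GCC):

    cmake .. -DGMX_GPU=on \
          -DCMAKE_C_COMPILER=/opt/gcc-7.3.0/bin/gcc \
          -DCMAKE_CXX_COMPILER=/opt/gcc-7.3.0/bin/g++ \
          -DCUDA_HOST_COMPILER=/usr/bin/gcc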

Re: [gmx-users] anyone have a development version or fork of gromacs that puts bonded interactions on the GPUs?

2018-04-05 Thread Kevin Boyd
Hi Chris, My experience has been that GPUs do significantly increase performance in Martini simulations, perhaps not quite as much as all-atom simulations but typically at least ~2x the speed of the same system on cpus alone. What combination of gromacs version/mdp options/hardware are you

Re: [gmx-users] Installation

2018-04-01 Thread Kevin Boyd
Hi Alex, I think by default cmake selects the first C compiler it runs across while searching, and the cmake default search path may not coincide with the order of your PATH environment variable, so when you have multiple versions it might not select the one you want. You can manually set the
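For example (paths are placeholders):

    cmake .. -DCMAKE_C_COMPILER=/usr/local/bin/gcc -DCMAKE_CXX_COMPILER=/usr/local/bin/g++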

Re: [gmx-users] Dose gpu support on flatbottom restraint

2018-03-31 Thread Kevin Boyd
Hi, I believe flat bottom potentials had a bug that affected GPUs on multiple ranks that was fixed in version 2016.4. http://manual.gromacs.org/documentation/2018.1/release-notes/2016/2016.4.html If you need to use Gromacs v5, I think the patch was applied to v5.1.5 as well.