Re: [gmx-users] polymer & peptide interaction pbc, visualization problem

2019-11-07 Thread p buscemi
I've run protein adsorptions on PE, PEO, nylon, and the like. The way I approach such models is to first model the surface. Whether you want an "extruded" polymer or a cast surface will define how you restrain it. For most of my models - polymer strands of about 1000 atoms, and using
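One way to pin a surface layer is gmx genrestr with per-axis force constants; a minimal sketch (the file names, index group, and force-constant values are illustrative, not from the original post):

  gmx genrestr -f surface.gro -n index.ndx -o posre_surface.itp -fc 0 0 1000

genrestr prompts for the index group to restrain; zero force constants in x and y leave lateral motion free while holding the layer in z.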

[gmx-users] gromos force field

2019-09-21 Thread p buscemi
Dear Users, I often use the GROMOS force field because ATB provides top files for reasonably large molecules (~1000 atoms). With Gromacs 2019.3 the error now appears: The GROMOS force fields have been parametrized with a physically incorrect multiple-time-stepping scheme for a twin-range cut-off.
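If grompp raises this as a blocking warning rather than a note, it can be acknowledged explicitly; this only silences the warning and does not change the physics (file names are placeholders):

  gmx grompp -f md.mdp -c conf.gro -p topol.top -o md.tpr -maxwarn 1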

[gmx-users] gpu usage

2019-08-20 Thread p buscemi
Dear Users, I am getting reasonable performance from two RTX 2080 Ti's (AMD 32 core) and, on another node, two GTX 1080 Ti's (AMD 16 core), i.e. 20-30 ns/day with 30 atoms. But in all my runs the % usage of the GPUs is typically 40% to 60%. Given that it is specialized software, I notice
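GPU utilization often rises when more work is offloaded explicitly; one possible mdrun invocation for a two-GPU node (thread counts are illustrative, and PME on GPU needs a single dedicated PME rank here):

  gmx mdrun -deffnm npt -ntmpi 2 -ntomp 8 -npme 1 -nb gpu -pme gpu -gpu_id 01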

Re: [gmx-users] volume not shrinking when restraints present

2019-06-05 Thread p buscemi
*** gen-temp = 200 On Jun 5 2019, at 12:48 pm, p buscemi wrote

Re: [gmx-users] Problem using RTX 2070

2019-06-05 Thread p buscemi
Armenio, I've used GTX 1060, 1070, 1080 and RTX 2080. The RTX 2070 should also be compatible. The problem may lie in the build. Paul

[gmx-users] volume not shrinking when restraints present

2019-06-05 Thread p buscemi
Dear Users, I've been trying to solve this for several days. A box with 100 polymer strands, restrained in only the x direction at either end, is built; under NVT the strands will coalesce into groups of 8-15 molecules. Under NPT, whether isotropic, semiisotropic, or surface-tension, the box will
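A frequent cause of a box refusing to shrink under NPT with position restraints is that the restraint reference coordinates are not scaled with the box; the refcoord-scaling option addresses this. A minimal .mdp fragment (coupling choice and values are illustrative):

  pcoupl           = Berendsen
  pcoupltype       = semiisotropic
  ref-p            = 1.0 1.0
  compressibility  = 4.5e-5 4.5e-5
  refcoord-scaling = com    ; scale restraint reference positions with the box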

Re: [gmx-users] Use of Restraint itps

2019-05-22 Thread p buscemi
On May 22 2019, at 12:08 pm, Bratin Kumar Das <177cy500.bra...@nitk.edu.in> wrote: > Hi > You can use the gmx genrestr tool to create a restraint .itp file for any set > of atoms > > On Wed 22 May, 2019, 8:12 PM p buscemi, wrote: > > > > Dear Users, >
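A typical invocation, run against the structure the restraints will apply to (file names are placeholders):

  gmx genrestr -f conf.gro -n index.ndx -o posre.itp -fc 1000 1000 1000

The tool prompts for the index group to restrain; the three force constants (x, y, z) are in kJ mol^-1 nm^-2.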

[gmx-users] Use of Restraint itps

2019-05-22 Thread p buscemi
Dear Users, In using restraint files, I place the restraint itp in a separate directory in which there may be other restraint files. I notice that within the restraint itp there is no specific reference to the molecule used to create it. I've run into an instance in which other than the
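Indeed, the link between a restraint file and its molecule is purely positional: the atom numbers in the .itp are local to whichever [ moleculetype ] the #include appears inside, so including the wrong file silently restrains the wrong atoms. A sketch of the intended layout (the moleculetype and file names here are hypothetical):

  [ moleculetype ]
  ; name  nrexcl
  PEO     3
  ...
  ; these restraints bind to PEO only because the include sits inside its moleculetype
  #ifdef POSRES
  #include "posre_peo.itp"
  #endif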

[gmx-users] charge density of multiple frames in a limited region

2019-05-09 Thread p buscemi
Dear Users, In using: gmx density -f lipid.nvt.trr -s octanoate.nvt3.tpr -dens charge -center -symm -sl 100 -b 4 -e 4.5 it appears that the average charge density over all frames and the full box is used in the calculation. Is there a way to 1) specify multiple ranges to output, say, every
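gmx density averages over whatever -b/-e window it is given, so one workaround is simply to loop over windows in the shell (the window width, output names, and the group numbers fed via printf are illustrative):

  for b in 4.0 4.5 5.0 5.5; do
    e=$(echo "$b + 0.5" | bc)
    printf '0\n0\n' | gmx density -f lipid.nvt.trr -s octanoate.nvt3.tpr \
        -dens charge -center -symm -sl 100 -b "$b" -e "$e" -o charge_${b}.xvg
  done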

[gmx-users] Pot'l energy difference between runs

2019-05-06 Thread p buscemi
Dear users, I've run a water membrane model through minimization, NVT, and NPT with pressure coupling = surface-tension, with compressibility 4.5e-5 4 0 and with 4.5e-5 4.5e-5. I wanted to increase the box size in the z direction to create an air (really vacuum) interface. The membrane is in the
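Enlarging the box along z on the final frame is one way to open the vacuum gap before continuing (the box lengths and centering values are placeholders):

  gmx editconf -f npt.gro -o npt_vac.gro -box 6.0 6.0 18.0 -center 3.0 3.0 3.0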

[gmx-users] error particles communicated, 2/3 cutoff, domain composition - with a twist

2019-04-15 Thread p buscemi
Dear Users, I've gotten the apparently common error "particles communicated to PME rank 19 are more than 2/3 times the cut-off out of the domain decomposition cell of their charge group in dimension x. This usually means that your system is not well equilibrated." but with a twist. The 5 ns npt

Re: [gmx-users] gromacs installation

2019-04-11 Thread p buscemi
when you open a new terminal you need to run "source" again, but you probably know this by now; or add the source line to your ~/.profile (or /etc/profile) to make the change permanent
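Concretely, for the default install prefix (adjust the path to your build):

  source /usr/local/gromacs/bin/GMXRC
  echo 'source /usr/local/gromacs/bin/GMXRC' >> ~/.profile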

[gmx-users] Energy from a subgroup of molecules

2019-04-10 Thread p buscemi
Dear Users, I've performed an adsorption experiment in which a fraction of the molecules in solution adsorb to a surface. I can extract the index of those adsorbed, and I can obtain the total interaction (LJ) energy of the energy group with the surface. I can estimate the average interaction of the
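Group-group terms come from energygrps plus a rerun over the saved trajectory, since per-group energies may not be computed by the fast GPU kernels during the production run (group and file names are placeholders):

  ; rerun.mdp
  energygrps = Adsorbed Surface

  gmx grompp -f rerun.mdp -c conf.gro -n index.ndx -p topol.top -o rerun.tpr
  gmx mdrun -s rerun.tpr -rerun traj.trr -deffnm rerun
  gmx energy -f rerun.edr

and select the LJ-SR:Adsorbed-Surface term in gmx energy.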

Re: [gmx-users] Results of villin headpiece with AMD 8 core

2019-01-14 Thread p buscemi
Mirco, to continue: the results from the 32 core AMD Ryzen / 1080ti. 8.4 % of the available CPU time was lost due to load imbalance in the domain decomposition.

                 Core t (s)   Wall t (s)        (%)
         Time:   151131.597     2361.432     6400.0
                           39:21
                   (ns/day)    (hour/ns)
  Performance:        7.318        3.280

command gmx mdrun -deffnm

Re: [gmx-users] error on opening gmx_mpi

2018-12-19 Thread p buscemi
core 2990 AMD TR. DLB is taking some time and I will be tuning the system today, but it works. Results for 80k atoms will be reported. Thank you all. Paul On Dec 19 2018, at 11:36 am, p buscemi wrote: > Gett

Re: [gmx-users] error on opening gmx_mpi

2018-12-19 Thread p buscemi
ters right now. > Paul > -Original Message- > From: gromacs.org_gmx-users-boun...@maillist.sys.kth.se > On Behalf Of Justin > Lemkul > Sent: Wednesday, December 19, 2018 10:47 AM > To: Discussion list for GROMACS users > Subject: Re: [gmx-users] error on opening gmx_mpi

Re: [gmx-users] error on opening gmx_mpi

2018-12-19 Thread p buscemi
, please check the GROMACS website at http://www.gromacs.org/Documentation/Errors On Dec 19 2018, at 10:04 am, p buscemi wrote: > here is the output from the gmx_mpi command. I would think the correct > v

Re: [gmx-users] error on opening gmx_mpi

2018-12-19 Thread p buscemi
rom. it may be easier to purge > everything and start again. > Paul > > On Dec 18, 2018, at 8:48 PM, Shi Li wrote: > > > > > > Message: 3 > > > Date: Tue, 18 Dec 2018 15:12:00 -0600 > > > From: p buscemi > > > To: "=?utf-8?Q?gmx-use

[gmx-users] error on opening gmx_mpi

2018-12-18 Thread p buscemi
I installed the 2019 beta gmx_mpi with: cmake .. -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=on -DCMAKE_CXX_COMPILER=/usr/bin/g++-7 -DCMAKE_C_COMPILER=/usr/bin/gcc-7 -DGMX_MPI=ON -DGMX_USE_OPENCL=ON The install completed with no errors. I need to take this step by step: in
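After a clean install, sourcing GMXRC and querying the binary is a quick sanity check (default install prefix assumed):

  source /usr/local/gromacs/bin/GMXRC
  gmx_mpi --version

The version output should report an MPI build with OpenCL GPU support, matching the cmake flags above.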

Re: [gmx-users] using dual CPU's

2018-12-13 Thread p buscemi
Carsten, thanks for the suggestion. Is it necessary to use the MPI version of GROMACS when using -multidir? I now have the single-node version loaded. I'm hammering out the first 2080ti with the 32 core AMD; results are not stellar, slower than an Intel i7-7000. But I'll beat on it some more
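For what it's worth, -multidir is only available in an MPI-enabled build (gmx_mpi); a minimal launch of four replicas might look like this (directory names are placeholders, and the rank count must be divisible by the number of directories):

  mpirun -np 4 gmx_mpi mdrun -multidir run1 run2 run3 run4 -deffnm npt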

Re: [gmx-users] using dual CPU's

2018-12-10 Thread p buscemi
Thank you, Mark, for the prompt response. I realize the limitations of the system (it's over 8 years old), but I did not expect the speed to decrease by 50% with 12 available threads! No combination of ntomp, ntmpi could raise ns/day above 4 with two GPUs, vs 6 with one GPU. This is actually a
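For reference, a typical explicit two-GPU mapping looks like this (thread counts are illustrative and should be tuned to the CPU):

  gmx mdrun -deffnm npt -ntmpi 2 -ntomp 6 -nb gpu -gpu_id 01

With one rank per GPU, -gpu_id 01 assigns GPU 0 to the first rank and GPU 1 to the second.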