Re: [gmx-users] cannot load the xtc file (without PBC) to VMD

2020-01-24 Thread Lalehan Ozalp
I didn't get you. So is your problem solved using the .trr file or not? > - Message from Lalehan Ozalp - > Date: Tue, 14 Jan 2020 12:53:54 +0300 > From: Lalehan Ozalp > Reply-To: gmx-us...@gromacs.org > Subject: Re: [gmx-users] cannot load the xtc file (without PBC) to VMD

Re: [gmx-users] cannot load the xtc file (without PBC) to VMD

2020-01-14 Thread Lalehan Ozalp
Thank you, but I haven't tried with a .trr file; actually, the last .trr file I've produced is em.trr. Thanks, On Mon, Jan 13, 2020 at 5:40 PM wrote: > Hi > Have you tried with the .trr file? Try the same procedure with the .trr file. > Thanks > - Message from L

[gmx-users] cannot load the xtc file (without PBC) to VMD

2020-01-13 Thread Lalehan Ozalp
Dear all, I have a question regarding loading a .xtc file (with no periodic boundary conditions), which I generated earlier, into VMD. To this end, I had used the command: trjconv -f md_0_10.xtc -s md_30.tpr -n index.ndx -o noPBC.xtc -pbc mol -ur compact -center and selected *protein* for centering, and
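For anyone following the same steps, the command above plus loading the result into VMD would look roughly like this (a sketch: the gmx wrapper syntax and the structure file name md_0_10.gro are assumptions; any structure file with a matching atom count works):

    gmx trjconv -f md_0_10.xtc -s md_30.tpr -n index.ndx -o noPBC.xtc -pbc mol -ur compact -center
    # choose "Protein" for centering and (typically) "System" for output when prompted
    vmd md_0_10.gro noPBC.xtc   # load the structure first, then the PBC-corrected trajectory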

Re: [gmx-users] the a.u. unit in the RMS distribution graph

2019-12-01 Thread Lalehan Ozalp
Thank you, Justin, that saved me a lot of time. Best, Lalehan On Sun, Dec 1, 2019 at 3:45 PM Justin Lemkul wrote: > > > On 12/1/19 3:48 AM, Lalehan Ozalp wrote: > > Hello Christian, thank you for providing the patch. However I wonder if > > there is an easier way to con

Re: [gmx-users] the a.u. unit in the RMS distribution graph

2019-12-01 Thread Lalehan Ozalp
Hello Christian, thank you for providing the patch. However, I wonder if there is an easier way to convert a.u. to frequency, as I would need to install GROMACS from scratch to apply the patch. One more thing: I used version 5.0, not 2019, and the patch is written for version 2019. Thank you

Re: [gmx-users] the a.u. unit in the RMS distribution graph

2019-11-29 Thread Lalehan Ozalp
Hello Justin, thank you for the response. In that case, I should use "frequency" if I plan to keep it the way it is. Thanks, On Fri, Nov 29, 2019 at 4:25 PM Justin Lemkul wrote: > > > On 11/29/19 7:18 AM, Christian Blau wrote: > > Hello Lalehan, > > > > > > a.u. stands for "arbitrary units". >

Re: [gmx-users] the a.u. unit in the RMS distribution graph

2019-11-29 Thread Lalehan Ozalp
values, you can read the a.u. as counts per length-interval. > Best, > Christian > On 2019-11-29 12:57, Lalehan Ozalp wrote: > > Hello everyone, I ran a cluster analysis for a 10 ns simulation and produced rmsd-clust.xpm and rmsd-dist.xvg graphs. When I op

[gmx-users] the a.u. unit in the RMS distribution graph

2019-11-29 Thread Lalehan Ozalp
Hello everyone, I ran a cluster analysis for a 10 ns simulation and produced rmsd-clust.xpm and rmsd-dist.xvg graphs. When I open the .xvg file, I see "a. u." on the y axis, which I couldn't entirely understand. Is it supposed to stand for atomic units of length or mass, or something else? I provide
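For reference, a cluster run that produces those two files would look roughly like this (a sketch; the trajectory/tpr names and the cutoff are placeholders, not the poster's actual settings):

    gmx cluster -f md_0_10.xtc -s md_30.tpr -method gromos -cutoff 0.2 -dist rmsd-dist.xvg -o rmsd-clust.xpm -g cluster.log

The -dist output is a histogram of pairwise RMSD values, so its y axis is a count per RMSD bin rather than a physical quantity, which is why it is labelled "a.u." (arbitrary units) rather than atomic units.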

Re: [gmx-users] error of missing atoms due to an incorrectness in the .hdb file

2019-05-26 Thread Lalehan Ozalp
in the .pdb, not in the .rtp, the .itp, nor the .hdb file. This is absolutely a black box to me. Thanks again! On Thu, May 23, 2019 at 12:07 AM Justin Lemkul wrote: > > > On 5/22/19 4:14 PM, Lalehan Ozalp wrote: > > Hello all, > > > > I am trying to run a simulation with a p

[gmx-users] error of missing atoms due to an incorrectness in the .hdb file

2019-05-22 Thread Lalehan Ozalp
Hello all, I am trying to run a simulation of a protein with a covalently bound FAD cofactor. I followed the steps provided on the website http://www.gromacs.org/Documentation/How-tos/Adding_a_Residue_to_a_Force_Field and, I think, added the hydrogens to the .hdb file accordingly. While
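For anyone hitting the same wall: an .hdb entry starts with the residue name and the number of hydrogen-addition blocks, and each block gives the number of hydrogens to add, the addition type, the hydrogen name, and the heavy atoms that define its position. A rough sketch only; every atom name below is an illustrative placeholder, not taken from the actual FAD topology:

    FAD    2
    1    1    HN3    N3    C2     C4            ; one planar H on a ring nitrogen
    1    5    H4'    C4'   C3'    O4'    C5'    ; one tetrahedral H on the ribityl chain

If pdb2gmx still complains about missing atoms, the usual culprit is a mismatch: the atom names referenced in the .hdb blocks must match the names in the corresponding .rtp entry exactly.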

[gmx-users] Best method to calculate binding free energy with GROMACS version 2016.3

2019-04-25 Thread Lalehan Ozalp
Hi everyone, I've run several simulations (30 ns) of a set of 6 ligands with an enzyme in water. I want to employ a method to calculate the binding free energy of the ligands, but I'm aware that MM/PBSA is not compatible with version 2016.3. The force field I've used is charmm36-nov2018.ff. I know I
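One route that does work with stock GROMACS 2016.x is an alchemical free-energy calculation: decouple the ligand from its environment over a series of lambda windows and analyse the runs with gmx bar. A minimal sketch of the relevant .mdp lines, assuming the ligand molecule type is called LIG (a placeholder, not the poster's actual name); the lambda spacing and soft-core settings would need tuning:

    free-energy        = yes
    init-lambda-state  = 0           ; run one simulation per lambda state
    couple-moltype     = LIG         ; hypothetical ligand molecule name
    couple-lambda0     = vdw-q       ; fully interacting at state 0
    couple-lambda1     = none        ; fully decoupled at the last state
    couple-intramol    = no
    coul-lambdas       = 0.0 0.25 0.5 0.75 1.0 1.0  1.0 1.0  1.0
    vdw-lambdas        = 0.0 0.0  0.0 0.0  0.0 0.25 0.5 0.75 1.0
    sc-alpha           = 0.5
    sc-power           = 1
    nstdhdl            = 100         ; write dH/dlambda every 100 steps

After running all windows, something like gmx bar -f */dhdl.xvg -o bar.xvg gives the free-energy difference; combined with the same decoupling cycle for the ligand in water alone, this yields a binding free energy.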

Re: [gmx-users] how to increase GMX_OPENMP_MAX_THREADS

2019-02-27 Thread Lalehan Ozalp
Dear Szilárd, They most certainly are clear! I originally thought the GPU of the terminal would be very useful. Your suggestions are of great help. Thank you so much! Best wishes, Lalehan

Re: [gmx-users] how to increase GMX_OPENMP_MAX_THREADS

2019-02-27 Thread Lalehan Ozalp
Dear Szilárd, There is indeed one GPU. And please keep in mind that I used the -nt 72 option BEFORE the 2019-dev version. It looks like it employs the GPU by default, and apparently I don't know how to use it efficiently. Here is the info you asked for: system size: 130655 atoms; .mdp file: ;

Re: [gmx-users] how to increase GMX_OPENMP_MAX_THREADS

2019-02-27 Thread Lalehan Ozalp
Dear Carsten, thank you for your advice. I tried gmx mdrun -deffnm md_0_30 -ntmpi 4 -ntomp 18 -npme 1 -pme gpu -nb gpu, then -ntmpi 8 -ntomp 9, and later -ntmpi 12 -ntomp 6 (the rest of the command being the same), and I feel like none of them makes much of a difference. The CPU
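To compare those launch configurations objectively, the cycle and time accounting at the end of each mdrun log shows where the time goes and ends with the ns/day figure; for example (the log name follows the -deffnm used above):

    tail -n 60 md_0_30.log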

Re: [gmx-users] how to increase GMX_OPENMP_MAX_THREADS

2019-02-25 Thread Lalehan Ozalp
Dear Szilárd, Thank you for your response. I'm basically simulating an enzyme including a cofactor and the ligand that I'm interested in. I'm trying to observe the ligand's behaviour over a 30 ns trajectory. I'm confused by what you said. I used to employ the -nt option according to the old

Re: [gmx-users] how to increase GMX_OPENMP_MAX_THREADS

2019-02-24 Thread Lalehan Ozalp
Thank you for your reply. Next time I build GROMACS I'll keep that in mind. The funny thing is, nobody in the lab upgraded GROMACS; hence the complications. Best regards,

[gmx-users] how to increase GMX_OPENMP_MAX_THREADS

2019-02-20 Thread Lalehan Ozalp
Hello all, I'd been running simulations with GROMACS 2018 using 72 OpenMP threads without problems until (I assume) it was updated to the 2019 version. When I execute mdrun with the option -nt 72 (which is the number of cores on my machine), it says: "you are using 72 openmp threads, which is larger than
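Two usual ways around that limit (a sketch; the -deffnm name is taken from a later message in this thread, and the per-rank cap of 64 OpenMP threads is an assumption about this particular build):

    # rebuild GROMACS with a higher compiled-in limit:
    cmake .. -DGMX_OPENMP_MAX_THREADS=128
    # ...or keep the existing build and split the 72 cores across
    # thread-MPI ranks so no single rank exceeds the per-rank cap:
    gmx mdrun -deffnm md_0_30 -ntmpi 4 -ntomp 18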