[gmx-users] NAG N-Acetylglucosamine force field parameters

2020-04-06 Thread Javier Luque Di Salvo
Dear Gromacs users,

Do you know where I can search for force-field parameters for
N-acetylglucosamine (NAG)? I'm interested in the AMBER, CHARMM, and GROMOS
force fields; my GROMACS version is 2018.3.
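
In the meantime I have been checking whether any of the force fields shipped
with GROMACS already define the residue; a minimal check, assuming GMXRC has
been sourced so that GMXDATA points to the share/gromacs directory:

grep -rl '\[ NAG \]' $GMXDATA/top/*.ff/*.rtp
# lists any bundled .rtp residue database that already contains a NAG entry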

Thanks in advance,

Javier

-- 
Javier Luque Di Salvo
Dipartimento di Ingegneria Chimica
Università degli Studi di Palermo

[gmx-users] Protonation state of aminoacids from pdb and pdb2gmx

2020-04-06 Thread Javier Luque Di Salvo
Dear Gromacs users,

I'm aware that there may already be plenty of discussions on this matter,
but I would still like your advice. What is the proper way to know the
protonation state that pdb2gmx generates? I have downloaded a pdb file from
the Protein Data Bank; the structure has no hydrogens, and these are added
by pdb2gmx.

Is this information usually available in the original pdb (like residue
names) or in the generated topology? I would like to know where to look.
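
To make the question concrete: my current understanding (possibly wrong) is
that pdb2gmx assigns default protonation states silently unless the
interactive flags are used, something like this (protein.pdb is a
placeholder name):

gmx pdb2gmx -f protein.pdb -o processed.gro -p topol.top -ignh \
            -his -lys -arg -asp -glu
# the -his/-lys/-arg/-asp/-glu flags make pdb2gmx ask for the state of each
# titratable residue; afterwards the chosen states appear as residue names
# in topol.top (naming varies by force field, e.g. HISE/HISD/HISH or
# HIE/HID/HIP) and as the added hydrogens in processed.gro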

Javier

-- 
Javier Luque Di Salvo
Dipartimento di Ingegneria Chimica
Università degli Studi di Palermo

Re: [gmx-users] solvation free energy - gmx bar

2019-02-19 Thread Javier Luque Di Salvo
Hello Shadfar,

Check your mdp files for a missing or repeated init-lambda-state. I had the
same problem: in my case two xvg files corresponded to the same
init-lambda-state, so one lambda state was effectively missing.
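
A quick way to spot this, assuming the per-window mdp files match a pattern
like md*.mdp (adjust to your naming):

grep -H 'init-lambda-state' md*.mdp | sort
# each state 0..N-1 should appear exactly once; a duplicate or a gap leaves
# gmx bar without an overlapping path between neighbouring states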

Cheers
Javier

Date: Thu, 11 Jan 2018 23:33:28 +
> From: "Shadfar, Shamim" 
> To: "gmx-us...@gromacs.org" 
> Subject: [gmx-users] solvation free energy - gmx bar
>
> Hello everyone,
>
>
> I am doing a solvation free energy calculation for a small system
> (thermodynamic integration). At the analysis step, gmx bar (gmx bar -f
> md*.xvg -o -oi -oh), I am facing this error, which I never had before. I
> have all my md*.xvg files.
>
> Could anyone help me with that?
>
> Here is the error:
>
> md9.xvg: Ignoring set 'pV (kJ/mol)'.
> md9.xvg: 0.0 - 1.0; lambda = (0, 0.9, 0, 0, 0)
> dH/dl & foreign lambdas:
> dH/dl (mass-lambda) (51 pts)
> dH/dl (coul-lambda) (51 pts)
> dH/dl (vdw-lambda) (51 pts)
> dH/dl (bonded-lambda) (51 pts)
> dH/dl (restraint-lambda) (51 pts)
> delta H to (0, 0.8, 0, 0, 0) (51 pts)
> delta H to (0, 0.9, 0, 0, 0) (51 pts)
> delta H to (0, 1, 0, 0, 0) (51 pts)
>
> Writing histogram to histogram.xvg
>
> Back Off! I just backed up histogram.xvg to ./#histogram.xvg.1#
>
> ---
> Program: gmx bar, version 2016.3
> Source file: src/gromacs/gmxana/gmx_bar.cpp (line 1174)
>
> Fatal error:
> There is no path between the states X & Y below that is covered by foreign
> lambdas:
> cannot proceed with BAR.
> Use thermodynamic integration of dH/dl by calculating the averages of dH/dl
> with g_analyze and integrating them.
> Alternatively, use the -extp option if (and only if) the Hamiltonian
> depends linearly on lambda, which is NOT normally the case.
>
> lambda vector [X]: init-lambda-state=5 (mass-lambda) l=0 (coul-lambda) l=0.5
> (vdw-lambda) l=0 (bonded-lambda) l=0 (restraint-lambda) l=0
> lambda vector [Y]: init-lambda-state=7 (mass-lambda) l=0 (coul-lambda) l=0.7
> (vdw-lambda) l=0 (bonded-lambda) l=0 (restraint-lambda) l=0
>
> For more information and tips for troubleshooting, please check the GROMACS
> website at http://www.gromacs.org/Documentation/Errors
>
>
>
> Shamim Shadfar
>
> PhD Candidate at Massey University
> Institute of Natural and Mathematical Sciences (INMS)
> Centre of Theoretical Chemistry and Physics (CTCP)
>
> Auckland, New Zealand


[gmx-users] Pressure annealing?

2018-12-12 Thread Javier Luque Di Salvo
Dear Gromacs users,

During NPT runs, is it possible to increase or decrease the box vectors in a
controlled, user-defined way (analogous to temperature annealing)?
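
The closest thing I have found is the deform mdp option, which applies a
constant deformation velocity (in nm/ps) to chosen box vector components.
Is a sketch like this (values purely illustrative) the intended way?

; order of the six components: a(x) b(y) c(z) b(x) c(x) c(y)
; grow the c(z) box vector at 0.001 nm/ps and leave the rest fixed
deform = 0 0 0.001 0 0 0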

Kind regards

Javier Luque Di Salvo


Re: [gmx-users] Merge pdb files into single box

2018-09-20 Thread Javier Luque Di Salvo
Thanks Justin,

Then playing with cat, box dimensions, and editconf (translations, plus
renumbering atoms/residues if needed) should work.
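
For the archive, a sketch of what I mean (file names are placeholders; a
10 nm cubic box as an example):

gmx editconf -f sysA.pdb -o sysA_box.pdb -box 10 10 10 -center 2.5 5 5
gmx editconf -f sysB.pdb -o sysB_box.pdb -box 10 10 10 -center 7.5 5 5
grep -v '^END'    sysA_box.pdb  > merged.pdb  # drop the trailing END record
grep -v '^CRYST1' sysB_box.pdb >> merged.pdb  # drop the duplicate box line
gmx editconf -f merged.pdb -o merged_renum.pdb -resnr 1  # renumber residues

If overlaps are a concern, gmx insert-molecules -f sysA_box.pdb -ci sysB.pdb
-nmol 1 should place the second structure while checking for clashes.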

Best regards

>
> On 9/20/18 10:28 AM, Javier Luque Di Salvo wrote:
> > Dear users,
> >
> > Browsing past threads, I found that previous versions had a 'cat' tool
> > to merge two (or more) pdb structure files. I was wondering whether this
> > tool is still available in current versions? I am using version 5.0.7.
>
> cat is a Linux command, not a GROMACS program.
>
> -Justin
>
> > The pdb structures I would like to merge are very different from each
> > other and not bonded to each other; all I need is to put them inside the
> > same simulation box with no overlaps. Since the systems are quite big, I
> > would like to avoid manual manipulation with a molecular editor. A few
> > tests with trjcat gave me errors because the systems are of different
> > composition; maybe I need to use trjcat with further options?
> >
> > Best regards
>
> --
> ==
>
> Justin A. Lemkul, Ph.D.
> Assistant Professor
> Virginia Tech Department of Biochemistry
>
> 303 Engel Hall
> 340 West Campus Dr.
> Blacksburg, VA 24061
>
> jalem...@vt.edu | (540) 231-3129
> http://www.thelemkullab.com
>
> ==


[gmx-users] Merge pdb files into single box

2018-09-20 Thread Javier Luque Di Salvo
Dear users,

Browsing past threads, I found that previous versions had a 'cat' tool to
merge two (or more) pdb structure files. I was wondering whether this tool
is still available in current versions? I am using version 5.0.7.

The pdb structures I would like to merge are very different from each other
and not bonded to each other; all I need is to put them inside the same
simulation box with no overlaps. Since the systems are quite big, I would
like to avoid manual manipulation with a molecular editor. A few tests with
trjcat gave me errors because the systems are of different composition;
maybe I need to use trjcat with further options?

Best regards
-- 
Javi


[gmx-users] Calculate and print ALL distances between atoms A and B from a structure file (i.e. not over time), with detailed information

2018-08-27 Thread Javier Luque Di Salvo
Dear users,

I am trying to calculate ALL distances between two types of atoms, say atom
A and atom B. The system is a polymer composed of 20 linear chains. Each
chain has one atom A that can form a crosslink with some atom B of an
adjacent, nearby chain. Thus, in order to generate such crosslinks
(self-crosslinks are not necessarily excluded), I run NVT and NPT
simulations to compress the linear chains and bring them together. At some
point, when the chains are close, I need to analyze the A-B distances that
fall within a certain cutoff, in order to decide whether a bond can be
formed and added following the specbond.dat topology directives.

I tried mindist, distance, and pairdist, but I was not able to generate an
output that lists all the distances between a given atom A and all atoms B.
This output should contain the atom number, chain, and residue of each atom
involved in each distance. Atom type B is present 15 times on each linear
chain; if no cutoff is specified, the output should contain 15x20 distances
for each A atom, i.e. 15x20x20 distances in total.

Currently I can identify only the minimum distance between atoms A and B,
but I am a bit lost on how to generate the more detailed output described
above. Any guidance is appreciated.
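
In case it clarifies what I am after, the closest I have come is something
like this (the group names atomsA/atomsB are placeholders from my index
file):

gmx pairdist -f traj.xtc -s topol.tpr -n index.ndx \
             -ref atomsA -sel atomsB \
             -refgrouping none -selgrouping none \
             -cutoff 0.5 -o distAB.xvg
# with both groupings set to 'none', every A-B pair gets its own column;
# the xvg legend identifies the pairs, but mapping columns back to atom
# number/chain/residue still seems to need post-processing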

Best regards



Javier Luque Di Salvo


Re: [gmx-users] Free Energy solvation generate multiple dhdl.xvg output files

2018-03-20 Thread Javier Luque Di Salvo
Dear Wes,

Thanks for clarifying; I did as you suggested and it worked. The resulting
dG_hyd of NH4+ is -62.5 kJ/mol, while the experimental value is -285 kJ/mol
(10.1039/FT9918702995). I am using OPLS and SPC/E water. Energy values at
the last lambdas showed larger error bars, so now I am refining that part by
increasing the number of lambda points:
fep-lambdas = 0.0 0.2 0.4 0.6 0.8 0.85 0.9 0.92 0.94 0.96 0.98 1.0
I do not expect big differences. I could also try (i) another water model,
(ii) a bigger box with more NH4+ ions (same concentration) or (ii-b) a
higher NH4+ concentration, (iii) adding anions (like Cl-). I am not sure yet
whether these options make sense (there may be a more suitable one, and some
of the tests I listed may not yield significant improvements), so any
comments on them are welcome.

Also, I do not understand why the nitrogen partial charge is negative;
shouldn't it be positive? I chose OPLS since it was parameterized for liquid
simulations, but I found in the literature that there are often problems in
correctly reproducing the dG_hyd of amines.

 atom   charge     mass
 N00   -0.5183  14.0070
 H01    0.3796   1.0080
 H02    0.3796   1.0080
 H03    0.3796   1.0080
 H04    0.3796   1.0080
Maybe the best option is to change force field (does anyone know of an
acceptable, preferably all-atom, force field for amines/ammonium derivatives
R4N+)?

Best regards
Javier

Date: Mon, 19 Mar 2018 13:50:19 -0400
> From: Wes Barnett 
> To: gmx-us...@gromacs.org
> Subject: Re: [gmx-users] Free Energy solvation generate multiple
> dhdl.xvg output files

[gmx-users] Free Energy solvation generate multiple dhdl.xvg output files

2018-03-19 Thread Javier Luque Di Salvo
Dear Gromacs users,
I am trying to run free energy calculations of a single ion in water (to
calculate its solvation energy). Since this is a new topic for me, I am
following the tutorial 'Solvation free energy of ethanol' (easily found
online). The three steps run without errors (1: energy minimization, 2:
equilibration, 3: lambda (de)coupling), but I am not obtaining the separate
dhdl.xvg files after step 3, only a single output file, run.xvg. The reason
is that the tutorial uses a script named mklambdas.sh to generate the
separate files in separate folders. Since I would like to keep track of the
.mdp options and understand how they work, I am not using this script;
instead, I am trying to define the mdp options myself:

_
; Free energy variables
free-energy  = yes
couple-moltype   = NH4
couple-lambda0   = none
couple-lambda1   = vdwq
couple-intramol  = no
init-lambda  = -1
init-lambda-state= 0
delta-lambda = 0
nstdhdl  = 50
fep-lambdas  = 0.0 0.2 0.4 0.6 0.8 0.9 1.0
mass-lambdas =
coul-lambdas =
vdw-lambdas  =
bonded-lambdas   =
restraint-lambdas=
temperature-lambdas  =
calc-lambda-neighbors= 1
init-lambda-weights  =
dhdl-print-energy= no
sc-alpha = 1.0
sc-power = 1
sc-r-power   = 6
sc-sigma = 0.3
sc-coul  = no
separate-dhdl-file   = yes
dhdl-derivatives = yes
dh_hist_size = 0
dh_hist_spacing  = 0.1
_

When trying to use "gmx bar -f run.xvg -g run.edr -o bar.xvg" I get this
error:
___
run.xvg: Ignoring set 'pV (kJ/mol)'.
run.xvg: 0.0 - 200.0; lambda = 0
dH/dl & foreign lambdas:
dH/dl (fep-lambda) (2001 pts)
delta H to 0 (2001 pts)
delta H to 0.2 (2001 pts)

Opened run.edr as single precision energy file
Reading energy frame      0 time    0.000
---
Program: gmx bar, version 2016.4
Source file: src/gromacs/gmxana/gmx_bar.cpp (line 3286)
Fatal error:
Did not find delta H information in file run.edr


This is the run.xvg output file:


# Command line:
#   gmx mdrun -ntmpi 2 -ntomp 8 -pin on -v -deffnm run
# gmx mdrun is part of G R O M A C S:
#
# S  C  A  M  O  R  G
#
@title "dH/d\xl\f{} and \xD\f{}H"
@xaxis  label "Time (ps)"
@yaxis  label "dH/d\xl\f{} and \xD\f{}H (kJ/mol [\xl\f{}]\S-1\N)"
@TYPE xy
@ subtitle "T = 300 (K) \xl\f{} state 0: fep-lambda = 0."
@ view 0.15, 0.15, 0.75, 0.85
@ legend on
@ legend box on
@ legend loctype view
@ legend 0.78, 0.8
@ legend length 2
@ s0 legend "dH/d\xl\f{} fep-lambda = 0."
@ s1 legend "\xD\f{}H \xl\f{} to 0."
@ s2 legend "\xD\f{}H \xl\f{} to 0.2000"
@ s3 legend "pV (kJ/mol)"
0. -530.14746 0.000 -106.84067 9.7838984
0.1000 -177.17302 0.000 -33.945797 9.7844772
0.2000 -158.46501 0.000 -30.541293 9.7854595
0.3000 -143.72632 0.000 -27.669909 9.7857714
(...)




Could you guide me to a proper definition of the mdp options that yields
separate files, so that I can use the BAR algorithm afterwards?
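
My understanding so far (please correct me if wrong) is that gmx bar needs
one simulation per lambda state, i.e. one run per init-lambda-state, roughly
like this sketch (the file names are mine):

for i in 0 1 2 3 4 5 6; do   # seven states, matching fep-lambdas above
    mkdir -p lambda_$i
    sed "s/^init-lambda-state.*/init-lambda-state = $i/" run.mdp > lambda_$i/run.mdp
    gmx grompp -f lambda_$i/run.mdp -c equil.gro -p topol.top -o lambda_$i/run.tpr
    ( cd lambda_$i && gmx mdrun -deffnm run )
done
gmx bar -f lambda_*/run.xvg -o bar.xvg   # BAR over all the dhdl files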

Best regards

Javier


Re: [gmx-users] Performance test

2017-11-27 Thread Javier Luque Di Salvo
Dear community,
I am sharing the results of the scaling/performance test. I used this
command and checked the core usage with the htop tool (http://hisham.hm/htop/):

gmx mdrun -ntmpi 1 -ntomp N -pin on -deffnm <name> &

where N is the number of (logical) cores. The hardware is an Intel(R)
Core(TM) i7-6700 @ 3.40 GHz with 16 GB RAM and no GPU. I tested two polymer
chains of different size (psu10 = 552 atoms; psu36 = 1956 atoms) in 1 ns NPT
simulations of the previously equilibrated system; MD settings were the
V-rescale thermostat and Berendsen barostat, a 1 fs time step, a 1.0 nm
cut-off, and PME for the Coulomb interactions. The figures are at this link:
https://goo.gl/bVZKcU

And here is the table with the values, in case the figures do not open:
PSU10 (552 atoms)
N   wall-time (s)   ns/day
1   1057.166   81.7025
2   631.117  136.908
3   461.265  187.448
4   352.821  244.886
5   440.070  196.393
6   386.782  223.346
7   348.273  248.083
8   389.243  255.187
--
PSU36 (1956 atoms)
N   wall-time (s)   ns/day
1   2259.990  38.231
2   1254.619  68.870
3   875.394   99.267
4   672.042   128.570
5   822.385   105.056
6   712.061   121.338
7   628.172   137.551
8   576.145   149.963
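
For reproducibility, the scan itself can be scripted; this is roughly what I
ran (the tpr/deffnm names are mine):

for n in 1 2 3 4 5 6 7 8; do
    gmx mdrun -ntmpi 1 -ntomp $n -pin on -s psu10.tpr -deffnm psu10_ntomp$n
done
grep -H 'Performance:' psu10_ntomp*.log   # ns/day and hours/ns per run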

Kind regards,
Javi

2017-11-21 13:50 GMT+01:00 Javier E <jluquedisa...@gmail.com>:

> Dear users,
>
> I'm doing a performance analysis following
> http://manual.gromacs.org/documentation/5.1/user-guide/mdrun-performance.html
> and wanted to ask:
>
> Is there a "standard" procedure to test performance in GROMACS (on a
> single node, one multi-processor CPU)? Below are some results; the system
> is a small polymer chain of 542 atoms with no water, in 100 ps NPT runs
> (if more information about the MD settings is needed, please ask):
>
> Running on 1 node with total 4 cores, 8 logical cores
> Hardware detected:
>   CPU info:
> Vendor: GenuineIntel
> Brand:  Intel(R) Core(TM) i7-6700 CPU @ 3.40GHz
> SIMD instructions most likely to fit this hardware: AVX2_256
> SIMD instructions selected at GROMACS compile time: AVX2_256
>
> GROMACS version: VERSION 5.1.4
> Precision:single
> Memory model:64 bit
> MPI library:  thread_mpi
> OpenMP support: enabled (GMX_OPENMP_MAX_THREADS = 32)
> GPU support:   disabled
> OpenCL support:  disabled
>
>
> gmx mdrun -ntmpi 1 -ntomp <N> -v -deffnm out
>
>
> ___
> -ntomp | MPI/OpenMP  | Wall time (s) |  ns/day | % CPU | Note?* |
> -
>    1   | 1/1         |   1075.764    |  80.315 | 100.0 |   No   |
>    2   | 1/2         |    619.679    | 139.427 | 200.0 |   Yes  |
>    3   | 1/3         |    458.721    | 188.350 | 299.9 |   Yes  |
>    4   | 1/4         |    356.906    | 242.081 | 399.8 |   Yes  |
>    5   | 1/5         |    433.572    | 199.275 | 499.0 |   Yes  |
>    6   | 1/6         |    378.951    | 227.998 | 598.0 |   Yes  |
>    7   | 1/7         |    355.785    | 242.844 | 693.1 |   Yes  |
>    8   | 1/8 (def.)  |    328.520    | 262.081 | 779.0 |   No   |
> -
>
> *NOTE: The number of threads is not equal to the number of (logical) cores
>   and the -pin option is set to auto: will not pin threads to cores.
>
>
> If (MPI threads) x (OpenMP threads) = number of threads, does mdrun then
> use as many cores as threads, and can this be seen in the %CPU usage?
>
> For example, since I installed GROMACS with the defaults,
> GMX_OPENMP_MAX_THREADS is set to 32 threads, but this will never be
> reached with this hardware (4 cores, 8 logical); is this correct? For now
> I'm re-running the exact same tests to have at least one replica, and
> extending the system size and run time. Any suggestions on how to dig
> deeper into this kind of test are welcome,
>
> Best regards
> --
>
> Javier Luque Di Salvo
> Dipartimento di Ingegneria Chimica
> Università degli Studi di Palermo
> Viale delle Scienze, Ed. 6
> 90128 PALERMO (PA)
> +39.09123867503



-- 
Javier Luque Di Salvo
Dipartimento di Ingegneria Chimica
Università degli Studi di Palermo
Viale delle Scienze, Ed. 6
90128 PALERMO (PA)

[gmx-users] Workstation configuration advice?

2017-11-10 Thread Javier Luque Di Salvo
Dear users and developers,


I am about to acquire a workstation mainly to run GROMACS. In choosing the
right configuration that fits the budget, I came up with this basic
arrangement (from a Dell Precision 7820 Tower):

- Processor: Intel Xeon Gold 6130, 2.1 GHz (3.7 GHz Turbo), 16 cores, 10.4
GT/s 2 UPI, 22 MB cache, HT (125 W), DDR4-2666. Or any other similar Intel
Xeon(R) processor.

- Graphics card: NVIDIA Quadro P4000, 8 GB, 4DP (7x20TB): even though I
would like a Tesla P100, has anyone used Quadro cards or does anyone know
about their acceleration in GROMACS? Also, are Radeon graphics cards
compatible with / tested for GROMACS (I didn't check whether they support
CUDA)?

- Memory: 32 GB (2x16 GB or 4x8 GB?). Or would 16 GB of RAM and maybe 18 or
20 cores be better? The price rises very quickly when adding only two to
four cores, and the same goes for the graphics card!

- Primary hard disk: solid-state (SSD) Class 50, 512 GB.

About the primary hard disk, should it be an NVMe or a SATA SSD? I saw that
NVMe has much better I/O performance, but since it is relatively new tech I
don't know whether GROMACS can fully run on it; I believe it does. Also,
NVMe doesn't support RAID (is RAID a must for GROMACS?), and I'm afraid of
encountering basic problems such as not being able to properly install the
OS and the right dependencies/libraries/related software.
http://info.grcooling.com/nvme-vs-sata-ssd
- Secondary disk: SATA 3.5'' 1 TB, 7200 rpm.


I intend to simulate polyelectrolyte hydrocarbon polymers, their water
uptake, and ion diffusion, in simulation boxes of at most about 10x10x10 nm,
but most of the work would be in much smaller boxes.


Any comments are highly appreciated.

Best regards




Javier Luque Di Salvo
Dipartimento di Ingegneria Chimica
Università degli Studi di Palermo
Viale delle Scienze, Ed. 6
90128 PALERMO (PA)
+39.09123867503
