Re: [gmx-users] Need help with installation of Gromacs-2019.3 with Intel compilers

2019-10-08 Thread Lyudmyla Dorosh
I have tried this command line:
sudo cmake .. -DBUILD_SHARED_LIBS=OFF -DGMX_FFT_LIBRARY=mkl
-DCMAKE_INSTALL_PREFIX=$installDir -DGMX_MPI=ON -DGMX_OPENMP=ON
-DGMX_CYCLE_SUBCOUNTERS=ON -DGMX_GPU=OFF -DGMX_SIMD=SSE2
-DCMAKE_C_COMPILER="/home/doroshl/apps/intel/bin/icc"
-DCMAKE_CXX_COMPILER="/home/doroshl/apps/intel/bin/icpc"
-DREGRESSIONTEST_DOWNLOAD=ON
which had no errors for *cmake* or *make -j 4*, but *make check* gave me an
error:
...
[100%] Running all tests except physical validation
Test project /home/doroshl/gromacs-2019.3/build
  Start  1: TestUtilsUnitTests
 1/46 Test  #1: TestUtilsUnitTests ...***Failed    0.00 sec
/home/doroshl/gromacs-2019.3/build/bin/testutils-test: error while loading
shared libraries: libmkl_intel_lp64.so: cannot open shared object file: No
such file or directory
...
0% tests passed, 46 tests failed out of 46
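
(This is a run-time loader failure rather than a build failure: the test
binaries cannot find the MKL shared libraries. A likely fix — assuming the
standard Parallel Studio 2019 layout implied by the paths in this thread —
is to put the MKL runtime on LD_LIBRARY_PATH before running the tests:)

# Hedged sketch; the mklvars.sh location is an assumption based on the
# standard Intel 2019 install layout:
source /home/doroshl/apps/intel/compilers_and_libraries_2019.5.281/linux/mkl/bin/mklvars.sh intel64
make check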

so I included libmkl_intel_lp64.so:
sudo cmake .. -DBUILD_SHARED_LIBS=OFF -DGMX_FFT_LIBRARY=mkl
-DCMAKE_INSTALL_PREFIX=$installDir
-DMKL_LIBRARIES="/home/doroshl/apps/intel/compilers_and_libraries_2019.5.281/linux/mkl/lib/intel64_lin/libmkl_intel_lp64.so"
-DMKL_INCLUDE_DIR="/home/doroshl/apps/intel/intelpython2/lib"
-DCMAKE_CXX_LINK_FLAGS="-Wl,-rpath,/usr/bin/gcc/lib64 -L/usr/bin/gcc/lib64"
-DGMX_MPI=ON -DGMX_OPENMP=ON -DGMX_CYCLE_SUBCOUNTERS=ON -DGMX_GPU=OFF
-DGMX_SIMD=SSE2 -DCMAKE_C_COMPILER="/home/doroshl/apps/intel/bin/icc"
-DCMAKE_CXX_COMPILER="/home/doroshl/apps/intel/bin/icpc"
-DREGRESSIONTEST_DOWNLOAD=ON &> cmake.out
which doesn't give any error messages for cmake, but then *sudo make -j 4*
results in

[ 46%] Building CXX object
src/gromacs/CMakeFiles/libgromacs.dir/pulling/pullutil.cpp.o
icpc: error #10105:
/home/doroshl/apps/intel/compilers_and_libraries_2019.5.281/linux/bin/intel64/mcpcom:
core dumped
icpc: warning #10102: unknown signal(694380720)
icpc: error #10106: Fatal error in
/home/doroshl/apps/intel/compilers_and_libraries_2019.5.281/linux/bin/intel64/mcpcom,
terminated by unknown
compilation aborted for
/home/doroshl/gromacs-2019.3/src/gromacs/pulling/pullutil.cpp (code 1)
src/gromacs/CMakeFiles/libgromacs.dir/build.make:2136: recipe for target
'src/gromacs/CMakeFiles/libgromacs.dir/pulling/pullutil.cpp.o' failed
make[2]: *** [src/gromacs/CMakeFiles/libgromacs.dir/pulling/pullutil.cpp.o]
Error 1
make[2]: *** Waiting for unfinished jobs
CMakeFiles/Makefile2:2499: recipe for target
'src/gromacs/CMakeFiles/libgromacs.dir/all' failed
make[1]: *** [src/gromacs/CMakeFiles/libgromacs.dir/all] Error 2
Makefile:162: recipe for target 'all' failed
make: *** [all] Error 2
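
(A simpler configuration may sidestep both problems. With icc, GROMACS can
normally locate MKL by itself once the compiler environment is sourced, so
the explicit MKL_LIBRARIES/MKL_INCLUDE_DIR settings should not be needed —
and MKL_INCLUDE_DIR above points at an intelpython directory, which looks
wrong. Internal compiler errors like #10105/#10106 usually indicate a
compiler bug, so a newer 2019 update release is worth trying if the crash
persists. A sketch, with GMX_SIMD=AVX_512 chosen to match the Xeon W-2175
rather than the SSE2 used above; the compilervars.sh location is an
assumption based on the standard install layout:)

source /home/doroshl/apps/intel/compilers_and_libraries_2019.5.281/linux/bin/compilervars.sh intel64
cmake .. -DCMAKE_C_COMPILER=icc -DCMAKE_CXX_COMPILER=icpc \
         -DGMX_FFT_LIBRARY=mkl -DGMX_MPI=ON -DGMX_OPENMP=ON \
         -DGMX_SIMD=AVX_512 -DCMAKE_INSTALL_PREFIX=$installDir \
         -DREGRESSIONTEST_DOWNLOAD=ON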
Thanks for any help


On Tue, Oct 8, 2019 at 2:21 AM Paul bauer  wrote:

> Hej,
>
> I can't access the repository, so I can't say for certain what happened.
> Can you share your cmake command line?
>
> Cheers
>
> Paul
>
> On 07/10/2019 21:25, Lyudmyla Dorosh wrote:
> > Hello Gromacs Developers/Users,
> >
> > I'm trying to install Gromacs-2019.3 on Intel Xeon W-2175 with Intel
> > compilers (+MKL+MPI).
> > First I compiled cmake with the Intel compilers. All output files are
> > attached.
> > cmake and make seemed to go OK, but all check tests failed. What am I
> > doing wrong?
> >
> https://drive.google.com/file/d/1M8aOaq7ocmK4UOAzcRb5AqWMihIhVtmn/view?usp=sharing
> >
> > Thank you,
> >
> > Lyudmyla Dorosh, PhD
> > 
> > University of Alberta
> > Department of Electrical and Computer Engineering,
> > 4-021 ECERF
> > Edmonton, AB, T6G 2G8
> > Canada
> > Email: dor...@ualberta.ca
> >
>
> --
> Paul Bauer, PhD
> GROMACS Release Manager
> KTH Stockholm, SciLifeLab
> 0046737308594
>
> --
> Gromacs Users mailing list
>
> * Please search the archive at
> http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before
> posting!
>
> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>
> * For (un)subscribe requests visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or
> send a mail to gmx-users-requ...@gromacs.org.
>


-- 
Regards,

Lyudmyla Dorosh, PhD

University of Alberta
Department of Electrical and Computer Engineering,
4-021 ECERF
Edmonton, AB, T6G 2G8
Canada
Email: dor...@ualberta.ca
-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.


Re: [gmx-users] Re: Re: Re: Implicit Solvent and Box Size

2019-10-08 Thread John Whittaker
Hi,

As Justin mentioned, with implicit solvent, there is no "box". The solute
is simply diffusing through space. The idea of a simulation box is only
relevant when you are using PBC.

Although box vectors are present at the bottom of your .gro file, they
don't really have a meaning if you're not using PBC. Similarly, if you
visualize the box in VMD you will see it, but again it has no physical
meaning in the simulation.
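
For reference, a minimal sketch of no-PBC implicit-solvent settings for
GROMACS versions that still support GB/SA (2018 and earlier); the OBC model
and the zeroed (infinite) cut-offs are illustrative choices, not settings
taken from this thread:

; illustrative implicit-solvent .mdp fragment
integrator        = sd       ; stochastic dynamics is common for implicit solvent
pbc               = no       ; no box, no periodicity
comm-mode         = angular  ; remove rotation as well as COM translation
cutoff-scheme     = group    ; GB/SA requires the group scheme
implicit-solvent  = GBSA
gb-algorithm      = OBC
rlist             = 0        ; 0 = infinite cut-offs (allowed without PBC)
rcoulomb          = 0
rvdw              = 0
rgbradii          = 0        ; must equal rlist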

With regards to your previous email: you mentioned that the molecules
"flew" out of the box. Did it seem as though the molecule was slowly
diffusing through space or did it seem like your system was exploding?

- John


> Dear Justin,
>
> I made the box with 'gmx editconf'. Should I add some constraints for the
> box boundary for non-pbc box? Would you please give more hints? Thank you
> very much!
>
>
> Regards,
> Zhuo
>
> 
> From: 张 卓
> Sent: 2019-10-08 10:40
> To: gromacs.org_gmx-users@maillist.sys.kth.se
> Subject: Re: Re: [gmx-users] Implicit Solvent and Box Size
>
> Dear Justin,
>
> I have tried implicit solvent without PBC or a barostat. The box was not
> compressed, but the molecules flew out of the box. I believe that was
> because the box has no PBC. Would you please explain why implicit solvent
> is used without PBC?
>
> Thanks!
>
>
> Regards,
> Zhuo
> --
> Gromacs Users mailing list
>
> * Please search the archive at
> http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before
> posting!
>
> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>
> * For (un)subscribe requests visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send
> a mail to gmx-users-requ...@gromacs.org.


-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.

[gmx-users] Re: Re: Re: Implicit Solvent and Box Size

2019-10-08 Thread Zhang Zhuo
Dear Justin,

I made the box with 'gmx editconf'. Should I add some constraints for the box 
boundary for a non-PBC box? Would you please give more hints? Thank you very much!


Regards,
Zhuo


From: 张 卓
Sent: 2019-10-08 10:40
To: gromacs.org_gmx-users@maillist.sys.kth.se
Subject: Re: Re: [gmx-users] Implicit Solvent and Box Size

Dear Justin,

I have tried implicit solvent without PBC or a barostat. The box was not 
compressed, but the molecules flew out of the box. I believe that was because 
the box has no PBC. Would you please explain why implicit solvent is used 
without PBC?

Thanks!


Regards,
Zhuo
-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.

[gmx-users] INFLATEGRO SHRINKING SCRIPT ERROR

2019-10-08 Thread Yogesh Sharma
hello everyone,

I have been struggling with the automated shrinking-iteration script from
Professor Lemkul's tutorials. Can anyone spot what I am doing wrong here?

sh run_inflategro.sh

Command line:
  gmx grompp -f minim_inflategro.mdp -c system_inflated.gro -p topol.top -r system_inflated.gro -o system_inflated_em.tpr

NOTE 1 [file minim_inflategro.mdp]:
  With Verlet lists the optimal nstlist is >= 10, with GPUs >= 20. Note
  that with the Verlet scheme, nstlist has no effect on the accuracy of
  your simulation.

Setting the LD random seed to 2088656998
Generated 1038 of the 3486 non-bonded parameter combinations
Excluding 3 bonded neighbours molecule type 'Protein'
Excluding 3 bonded neighbours molecule type 'POPE'
Excluding 3 bonded neighbours molecule type 'U1NL'

NOTE 2 [file topol.top, line 16988]:
  System has non-zero total charge: 3.98
  Total charge should normally be an integer. See
  http://www.gromacs.org/Documentation/Floating_Point_Arithmetic
  for discussion on how close it should be to an integer.

Removing all charge groups because cutoff-scheme=Verlet
Analysing residue names:
There are:   271    Protein residues
There are:   333      Other residues
Analysing Protein...
Analysing residues not classified as Protein/DNA/RNA/Water and splitting into groups...
Number of degrees of freedom in T-Coupling group rest is 59709.00

NOTE 3 [file minim_inflategro.mdp]:
  You are using a plain Coulomb cut-off, which might produce artifacts.
  You might want to consider using PME electrostatics.

Command line:
  gmx mdrun -deffnm system_inflated_em

Reading file system_inflated_em.tpr, VERSION 5.1.2 (single precision)
Using 1 MPI thread
Using 12 OpenMP threads

Steepest Descents:
   Tolerance (Fmax)   =  1.0e+03
   Number of steps    =  5

Energy minimization has stopped, but the forces have not converged to the
requested precision Fmax < 1000 (which may not be possible for your
system). It stopped because the algorithm tried to make a new step whose
size was too small, or there was no change in the energy since last step.
Either way, we regard the minimization as converged to within the available
machine precision, given your starting configuration and EM parameters.

Double precision normally gives you higher accuracy, but this is often not
needed for preparing to run molecular dynamics.

writing lowest energy coordinates.

Steepest Descents converged to machine precision in 174 steps,
but did not reach the requested Fmax < 1000.
Potential Energy  = -9.3649824e+07
Maximum force     =  1.9553424e+14 on atom 17983
Norm of force     =  1.9600521e+12

NOTE: 11 % of the run time was spent in pair search,
      you might want to increase nstlist (this has no effect on accuracy)

Command line:
  gmx trjconv -s system_inflated_em.tpr -f system_inflated_em.gro -o tmp.gro -pbc mol

Will write gro: Coordinate file in Gromos-87 format
Reading file system_inflated_em.tpr, VERSION 5.1.2 (single precision)
Reading file system_inflated_em.tpr, VERSION 5.1.2 (single precision)
Select group for output
Group     0 (         System) has 19904 elements
Group     1 (        Protein) has  2588 elements
Group     2 (      Protein-H) has  2034 elements
Group     3 (        C-alpha) has   271 elements
Group     4 (       Backbone) has   813 elements
Group     5 (      MainChain) has  1085 elements
Group     6 (   MainChain+Cb) has  1332 elements
Group     7 (    MainChain+H) has  1343 elements
Group     8 (      SideChain) has  1245 elements
Group     9 (    SideChain-H) has   949 elements
Group    10 (    Prot-Masses) has  2588 elements
Group    11 (    non-Protein) has 17316 elements
Group    12 (          Other) has 17316 elements
Group    13 (           POPE) has 17316 elements
Select a group: Selected 0: 'System'
Reading frames from gro file 'U1NL', 19904 atoms.
Reading frame       0 time    0.000
Precision of system_inflated_em.gro is 0.001 (nm)
Using output precision of 0.001 (nm)
Last frame          0 time    0.000

gcq#339: "Look at these, my work-strong arms" (P.J. Harvey)

##
RUNNING SHRINKING ITERATION {1..26}...
#
run_inflategro.sh: 33: run_inflategro.sh: Illegal number: {1..26}

If I have to do it manually, without the script, should this be the correct
order of commands, including minimization?

perl inflategro.pl system_inflated_em.gro 0.95 POPE 0
system_shrink1.gro 5 area_shrink1.dat
gmx grompp -f minim_inflategro.mdp -c system_inflated.gro -p topol.top
-r system_inflated.gro -o system_inflated_em.tpr

gmx mdrun -deffnm system_inflated_em

gmx trjconv -s system_inflated_em.tpr -f system_inflated_em.gro -o
tmp.gro -pbc mol
mv tmp.gro system_inflated_em.gro

perl inflategro.pl system_inflated_em.gro 0.95 POPE 0
system_shrink1.gro 5 area_shrink1.dat...

...x 26
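
A note on the error above: brace expansion like {1..26} is a bash feature,
and on many Linux systems plain `sh` is dash, which passes the string
through literally — hence "Illegal number: {1..26}" when the script later
uses it as a number. Running the script with bash, or rewriting the loop
with seq, normally fixes it. A minimal sketch of the loop follows; the file
naming mirrors this thread, each grompp takes the newest shrunken
coordinates (carried forward by the mv), and the script name is
illustrative:

#!/bin/bash
# run with: bash run_shrink.sh   (not "sh", which may be dash)
for i in $(seq 1 26); do
    # shrink the lipids by a factor of 0.95 around the protein
    perl inflategro.pl system_inflated_em.gro 0.95 POPE 0 \
        system_shrink${i}.gro 5 area_shrink${i}.dat
    # energy-minimise the shrunken system
    gmx grompp -f minim_inflategro.mdp -c system_shrink${i}.gro \
        -r system_shrink${i}.gro -p topol.top -o system_inflated_em.tpr
    gmx mdrun -deffnm system_inflated_em
    # re-wrap molecules and carry the result into the next iteration
    echo 0 | gmx trjconv -s system_inflated_em.tpr \
        -f system_inflated_em.gro -o tmp.gro -pbc mol
    mv tmp.gro system_inflated_em.gro
done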
-- 
Gromacs Users mailing list

* Please search the archive at 

Re: [gmx-users] [Performance] poor performance with NV V100

2019-10-08 Thread Szilárd Páll
Hi,

Can you please share your log files? We may be able to help with spotting
performance issues or bottlenecks. Note, however, that NVIDIA themselves
are the best source of help with reproducing their published benchmark
numbers.

Scaling across multiple GPUs requires some tuning of command-line options;
please see the related discussion on the list (briefly: use multiple ranks
per GPU, and one separate PME rank with GPU offload); a sketch is given
below.
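
A hedged example of such a launch on a 2-GPU node with GROMACS 2019 — the
rank/thread counts and the -gputasks mapping are assumptions that need
tuning per system, not values from this thread:

# 4 thread-MPI ranks: 3 PP + 1 separate PME rank, both offloaded to GPUs;
# -gputasks maps the four GPU tasks onto the two devices
gmx mdrun -ntmpi 4 -ntomp 4 -nb gpu -pme gpu -npme 1 \
          -gputasks 0011 -pin on -v -noconfout -nsteps 5000 -s topol.tpr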

Also note that intra-node strong scaling has not been an optimization
target of recent releases (there are no P2P optimizations either), but new
features going into the 2020 release will improve things significantly.
Keep an eye out for the beta2/3 releases if you are interested in checking
out the new features.

Cheers,
--
Szilárd


On Mon, Oct 7, 2019 at 7:48 AM Jimmy Chen  wrote:

> Hi,
>
> I'm evaluating an NVIDIA V100 to decide on a purchase, but I can't
> reproduce the reference performance data published here:
> https://developer.nvidia.com/hpc-application-performance
>
> https://www.hpc.co.jp/images/pdf/benchmark/Molecular-Dynamics-March-2018.pdf
>
>
> No matter using docker tag 18.02 from
> https://ngc.nvidia.com/catalog/containers/hpc:gromacs/tags
>
> or gromacs source code from
> ftp://ftp.gromacs.org/pub/gromacs/gromacs-2019.3.tar.gz
>
> test data set is ADH dodec and water 1.5M
> gmx grompp -f pme_verlet.mdp
> gmx mdrun -ntmpi 1 -nb gpu -pin on -v -noconfout -nsteps 5000 -s topol.tpr
> -ntomp 4
> and  gmx mdrun -ntmpi 2 -nb gpu -pin on -v -noconfout -nsteps 5000 -s
> topol.tpr -ntomp 4
>
> My CPU is Intel(R) Xeon(R) Gold 6142 CPU @ 2.60GHz
> and GPU is NV V100 16GB PCIE.
>
> For ADH dodec,
> The perf data of 2xV100 16GB PCIE in
> https://developer.nvidia.com/hpc-application-performance is 176 (ns/day).
> But I only can get 28 (ns/day). actually I can get 67(ns/day) with 1xV100.
> I don't know why I got poorer result with 2xV100.
>
> For water 1.5M
> The perf data of 1xV100 16GB PCIE in
>
> https://www.hpc.co.jp/images/pdf/benchmark/Molecular-Dynamics-March-2018.pdf
> is
> 9.83(ns/day) and 2xV100 is 10.41(ns/day).
> But what I got is 6.5(ns/day) with 1xV100 and 2(ns/day) with 2xV100.
>
> Could anyone suggest how to work out what is causing these numbers in my
> environment? Are my test commands wrong? Is there a recommended command
> for this benchmark, or a recommended source version to use now?
>
> By the way, after checking the code, it seems MPI traffic doesn't go over
> PCIe P2P or RDMA. Is that correct? Is there any plan to implement this?
>
> Best regards,
> Jimmy
> --
> Gromacs Users mailing list
>
> * Please search the archive at
> http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before
> posting!
>
> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>
> * For (un)subscribe requests visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or
> send a mail to gmx-users-requ...@gromacs.org.
>
-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.

Re: [gmx-users] ERROR :: Atomtype LC3 not found : KALP-15 in water tutorial

2019-10-08 Thread Justin Lemkul

On 10/8/19 8:44 AM, Seketoulie Keretsu wrote:

Dear Expert,

While doing the KALP-15 in DPPC tutorial, I came across this error. I have
tried searching Google for solutions but couldn't resolve it. I would
appreciate it if you would kindly let me know how to move forward:
INPUT COMMAND: gmx grompp -f minim_inflategro.mdp -c system_inflated.gro -p
topol.top -r system_inflated.gro -o system_inflated_em.tpr

OUTPUT ERROR:
ERROR 1 [file dppc.itp, line 7]:
   Atomtype LC3 not found


You haven't properly constructed the force field files. Go back to 
http://www.mdtutorials.com/gmx/membrane_protein/02_topology.html and 
work through each step again.
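
One quick diagnostic — a sketch assuming the tutorial layout, where the
lipid atomtypes are merged into a local force-field directory; the file
names are illustrative:

# where (if anywhere) is atomtype LC3 defined, and is that file
# #included in topol.top before dppc.itp?
grep -rn "LC3" ./*.ff/ dppc.itp
grep -n "#include" topol.top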


-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==

--
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.


[gmx-users] ERROR :: Atomtype LC3 not found : KALP-15 in water tutorial

2019-10-08 Thread Seketoulie Keretsu
Dear Expert,

While doing the KALP-15 in DPPC tutorial, I came across this error. I have
tried searching Google for solutions but couldn't resolve it. I would
appreciate it if you would kindly let me know how to move forward:
INPUT COMMAND: gmx grompp -f minim_inflategro.mdp -c system_inflated.gro -p
topol.top -r system_inflated.gro -o system_inflated_em.tpr

OUTPUT ERROR:
ERROR 1 [file dppc.itp, line 7]:
  Atomtype LC3 not found

Full error message:

 gmx grompp -f minim_inflategro.mdp -c system_inflated.gro -p topol.top -r
system_inflated.gro -o system_inflated_em.tpr


NOTE 1 [file minim_inflategro.mdp]:
  With Verlet lists the optimal nstlist is >= 10, with GPUs >= 20. Note
  that with the Verlet scheme, nstlist has no effect on the accuracy of
  your simulation.

Setting the LD random seed to -474520607
Generated 165 of the 1596 non-bonded parameter combinations

ERROR 1 [file dppc.itp, line 7]:
  Atomtype LC3 not found


There was 1 note

---
Program: gmx grompp, version 2018.4
Source file: src/gromacs/gmxpreprocess/toppush.cpp (line 1390)

Fatal error:
There was 1 error in input file(s)

Thank you - Seke
-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.


Re: [gmx-users] Re: Re: Implicit Solvent and Box Size

2019-10-08 Thread Justin Lemkul

On 10/7/19 10:40 PM, 张 卓 wrote:

Dear Justin,

I have tried implicit solvent without PBC or a barostat. The box was not 
compressed, but the molecules flew out of the box. I believe that was because 
the box has no PBC. Would you please explain why implicit solvent is used 
without PBC?


Because there's no need for it. In explicit solvent simulations, we use 
PME for long-range electrostatics and that requires PBC. In implicit 
solvent, there's no need (this is also a reason why implicit simulations 
have intrinsic accuracy issues).


Beyond that, regarding your observations, there is no box. The molecule 
is simply diffusing through space, which will happen in any simulation. 
In a simulation with PBC, this has the effect of the molecule crossing 
back and forth over borders ("jumping"), which simply doesn't happen in 
your case.


-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==

--
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.

Re: [gmx-users] Protein ligand simulation topology

2019-10-08 Thread Justin Lemkul

On 10/8/19 7:30 AM, DEEPANSHU SINGLA wrote:

Respected sir/mam,

While doing the simulation of my protein with a ligand I received the
following error:

Fatal Error:
No line with moleculetype 'SOL' found in the [molecules] section of the
'topol.top'

Molecule section of my topology file is as follows:
  [molecules]
; Compound        #mols
Protein_chain_A 1
AB1 1
SOL 17416

Also, the number of SOL molecules in the topology file does not match the
number shown in the terminal when selecting the continuous group of solvent.
In the terminal the number of solvent molecules is 52248.

Please help me resolve this issue.


You probably have Windows-style line endings. Use dos2unix to fix the 
.top file and be sure to always use a plain-text editor.
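
A quick way to confirm and fix this, assuming the standard `file` and
`dos2unix` utilities are available:

# "CRLF line terminators" in the output confirms Windows line endings
file topol.top
# convert in place
dos2unix topol.top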


-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==

--
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.


[gmx-users] Protein ligand simulation topology

2019-10-08 Thread DEEPANSHU SINGLA
Respected sir/mam,

While doing the simulation of my protein with a ligand I received the
following error:

Fatal Error:
No line with moleculetype 'SOL' found in the [molecules] section of the
'topol.top'

Molecule section of my topology file is as follows:
 [molecules]
; Compound        #mols
Protein_chain_A 1
AB1 1
SOL 17416

Also, the number of SOL molecules in the topology file does not match the
number shown in the terminal when selecting the continuous group of solvent.
In the terminal the number of solvent molecules is 52248.

Please help me resolve this issue.

Thanking you in advance.

Sincerely
Deepanshu SIngla
-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.


Re: [gmx-users] Need help with installation of Gromacs-2019.3 with Intel compilers

2019-10-08 Thread Paul bauer

Hej,

I can't access the repository, so I can't say for certain what happened.
Can you share your cmake command line?

Cheers

Paul

On 07/10/2019 21:25, Lyudmyla Dorosh wrote:

Hello Gromacs Developers/Users,

I'm trying to install Gromacs-2019.3 on Intel Xeon W-2175 with Intel
compilers (+MKL+MPI).
First I compiled cmake with the Intel compilers. All output files are attached.
cmake and make seemed to go OK, but all check tests failed. What am I doing wrong?
https://drive.google.com/file/d/1M8aOaq7ocmK4UOAzcRb5AqWMihIhVtmn/view?usp=sharing

Thank you,

Lyudmyla Dorosh, PhD

University of Alberta
Department of Electrical and Computer Engineering,
4-021 ECERF
Edmonton, AB, T6G 2G8
Canada
Email: dor...@ualberta.ca



--
Paul Bauer, PhD
GROMACS Release Manager
KTH Stockholm, SciLifeLab
0046737308594

--
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.