Re: [gmx-users] Amide hydrogen naming in charmm 36 forcefield

2019-07-19 Thread Dilip.H.N
Sir,
I have also tried with atom name H (I checked the .rtp file, and it is
named 'H'), but I am still getting the same error:

Atom HN in residue NMA 1 was not found in rtp entry NMA with 12 atoms
while sorting atoms.
First, I tried naming the amide hydrogens 'H' and got the above-mentioned
error, and then I followed the link
https://mailman-1.sys.kth.se/pipermail/gromacs.org_gmx-users/2014-May/088942.html
which stated: "CHARMM has unique nomenclature for amide H atoms, so H needs to be
renamed HN, though unfortunately, that's also wrong in the .rtp file. I'll
fix that for the future." Hence I changed 'H' to 'HN'.

In both cases (naming the amide hydrogens 'H' or 'HN'), I get the error.

The amide nitrogens are also not defined in the .hdb file of the CHARMM36 FF.

Any suggestions...?
Thank you.
---
With Best Regards,

Dilip.H.N
Ph.D. Student.


On Sat, Jul 20, 2019 at 5:30 AM Justin Lemkul  wrote:

>
>
> On 7/19/19 9:26 AM, Dilip.H.N wrote:
> > Hello,
> > I tried to get the topology for N-methyl acetamide (NMA) through the gmx
> > pdb2gmx command in the CHARMM36 FF, but I get the following error:
> >
> > Atom HN in residue NMA 1 was not found in rtp entry NMA with 12 atoms
> > while sorting atoms.
> > For a hydrogen, this can be a different protonation state, or it
> > might have had a different number in the PDB file and was rebuilt
> > (it might for instance have been H3, and we only expected H1 & H2).
> > Note that hydrogens might have been added to the entry for the
> N-terminus.
> > Remove this hydrogen or choose a different protonation state to solve it.
> > Option -ignh will ignore all hydrogens in the input.
> >
> > When I searched, I found the following link
> >
> https://mailman-1.sys.kth.se/pipermail/gromacs.org_gmx-users/2014-May/088942.html
> >
> > which states that "CHARMM has unique nomenclature for amide H
> > atoms, ..." as noted by Dr. Justin Lemkul. Even in the .hdb file,
> > the naming has not been addressed.
> > I have tried renaming the amide hydrogen from 'H' to 'HN', but it still
> gives
> > the error. Why is this amide hydrogen not named in the CHARMM36 FF?
> >
> > So, may I know whether this issue has been resolved?
>
> The atom name is H. Check the NMA entry in the .rtp file and you will
> see this.
>
> -Justin
>
> --
> ==
>
> Justin A. Lemkul, Ph.D.
> Assistant Professor
> Office: 301 Fralin Hall
> Lab: 303 Engel Hall
>
> Virginia Tech Department of Biochemistry
> 340 West Campus Dr.
> Blacksburg, VA 24061
>
> jalem...@vt.edu | (540) 231-3129
> http://www.thelemkullab.com
>
> ==
>


Re: [gmx-users] Drude force field

2019-07-19 Thread Myunggi Yi
Dear Justin,

I've followed the instructions and prepared my system.

I've got the following error message.

" No default Bond types."

There are several errors like this one.

The bond is between atoms 43 and 45; the following is part of the
dmpc.itp generated by pdb2gmx with drude-2013f_2018_10.ff:
    43   OD30CL   1   DMPC    O21   43    1.4847   15.5994   ; qtot 1.803
    44   DRUD     1   DMPC   DO21   44   -1.4847    0.0000   ; qtot 0.318
    45   LPD      1   DMPC   LPMA   45   -0.1700    0.0000   ; qtot 0.148
    46   LPD      1   DMPC   LPMB   46   -0.1700    0.0000   ; qtot -0.022
    47   CD2O3B   1   DMPC    C21   47    2.72816  11.6110   ; qtot 2.706
    48   DRUD     1   DMPC   DC21   48   -2.03116   0.0000   ; qtot 0.675

It seems GROMACS added a bond between them that is not defined in the
dmpc.itp file.
Is a bond between a polar heavy atom and its lone-pair virtual sites
expected?
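
As a quick diagnostic, one can check whether that atom pair really appears
in the [ bonds ] section of the topology; a minimal grep sketch (the file
name is from above; the pattern simply lists lines pairing atom 43 with
44-46, i.e. its Drude particle and lone pairs):

grep -nE '^[[:space:]]*43[[:space:]]+4[456]\b' dmpc.itp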

Program information:
GROMACS:  gmx grompp, version 2016-dev-20181220-a2cab74

Command line:
  gmx grompp -f em.mdp -c ../gg.gro -r ../gg.gro -p ../topol.top -o em.tpr


In addition, I do not have an integer total charge.

NOTE 2 [file topol.top, line 49]:
  System has non-zero total charge: -0.000464

Of course, when I generated the protein .itp, the total charge was 0.00.
As you know, DMPC is neutral too.
After combining protein, DMPC, and water, this happened.
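
As a sanity check, one way to see which molecule's [ atoms ] section carries
the fractional charge is to sum the charge column per .itp file; a minimal
awk sketch, assuming the charge is in column 7 and comment lines start with ';':

awk '/\[ *atoms *\]/{f=1; next} f && /^\[/{f=0} f && !/^;/ && NF>=7 {s+=$7} END {printf "qtot = %.6f\n", s}' dmpc.itp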

Any suggestion?

Thank you.


On Thu, Jul 18, 2019 at 11:22 AM Justin Lemkul  wrote:

> On Wed, Jul 17, 2019 at 8:58 PM Myunggi Yi  wrote:
>
> > Thank you Dr. Lemkul,
> >
> > I don't have ions in my simulation. It's a neutral system with a protein
> > in a membrane bilayer with solvent.
> > I have downloaded the force field (Drude FF for the CHARMM FF in GROMACS
> > format) to run the simulation with the CHARMM FF in "GROMACS 2019.3".
> > However, it seems the format of the file does not match the current
> > version.
> >
> > On the web it says:
> >
> > Compile and install as you would any other (post-5.0) GROMACS version. If
> > you attempt to use *ANY OTHER VERSION OF GROMACS, the Drude features will
> > not be accessible.*
> >
> > There are 5.0 and 5.1 series of GROMACS versions. Which one should I use?
> >
> > Or, is there a way to modify the force field format to use it with the
> > current version of GROMACS? Then I will modify the format.
> >
> >
> Read the information at the previous link more carefully. You cannot use
> any released version of GROMACS. You must use the developmental version as
> instructed in that link.
>
> -Justin
>
>
> >
> > On Thu, Jul 18, 2019 at 9:43 AM Justin Lemkul  wrote:
> >
> > >
> > >
> > > On 7/17/19 8:39 PM, Myunggi Yi wrote:
> > > > Dear users,
> > > >
> > > > I want to run a simulation with a polarizable force field.
> > > >
> > > > How and where can I get Drude force field for the current version of
> > > > Gromacs?
> > >
> > > Everything you need to know:
> > >
> > > http://mackerell.umaryland.edu/charmm_drude_ff.shtml
> > >
> > > The implementation is not complete. If your system has ions, do not use
> > > GROMACS due to the lack of NBTHOLE. In that case, use NAMD, CHARMM, or
> > > OpenMM. The Drude model is still considered experimental, hence it is
> > > not officially supported yet. There have been a lot of snags along the
> > > way (mostly in my time to get the code up to par for official
> inclusion).
> > >
> > > -Justin
> > >
> > > --
> > > ==
> > >
> > > Justin A. Lemkul, Ph.D.
> > > Assistant Professor
> > > Office: 301 Fralin Hall
> > > Lab: 303 Engel Hall
> > >
> > > Virginia Tech Department of Biochemistry
> > > 340 West Campus Dr.
> > > Blacksburg, VA 24061
> > >
> > > jalem...@vt.edu | (540) 231-3129
> > > http://www.thelemkullab.com
> > >
> > > ==
> > >
> --
>
> ==
>
> Justin A. Lemkul, Ph.D.
>
> Assistant Professor
>
> Office: 301 Fralin Hall
> Lab: 303 Engel Hall
>
> Virginia Tech Department of Biochemistry
> 340 West Campus Dr.
> Blacksburg, VA 24061
>
> jalem...@vt.edu | (540) 231-3129
> http://www.thelemkullab.com
>
> ==

Re: [gmx-users] Amide hydrogen naming in charmm 36 forcefield

2019-07-19 Thread Justin Lemkul




On 7/19/19 9:26 AM, Dilip.H.N wrote:

Hello,
I tried to get the topology for N-methyl acetamide (NMA) through the gmx
pdb2gmx command in the CHARMM36 FF, but I get the following error:

Atom HN in residue NMA 1 was not found in rtp entry NMA with 12 atoms
while sorting atoms.
For a hydrogen, this can be a different protonation state, or it
might have had a different number in the PDB file and was rebuilt
(it might for instance have been H3, and we only expected H1 & H2).
Note that hydrogens might have been added to the entry for the N-terminus.
Remove this hydrogen or choose a different protonation state to solve it.
Option -ignh will ignore all hydrogens in the input.

When I searched, I found the following link
https://mailman-1.sys.kth.se/pipermail/gromacs.org_gmx-users/2014-May/088942.html

which states that "CHARMM has unique nomenclature for amide H
atoms, ..." as noted by Dr. Justin Lemkul. Even in the .hdb file,
the naming has not been addressed.
I have tried renaming the amide hydrogen from 'H' to 'HN', but it still gives
the error. Why is this amide hydrogen not named in the CHARMM36 FF?

So, may I know whether this issue has been resolved?


The atom name is H. Check the NMA entry in the .rtp file and you will 
see this.
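
For reference, a quick way to confirm this from the shell (the force-field
directory and input file names are only examples; adjust to your
installation):

grep -A 15 "\[ NMA \]" charmm36.ff/merged.rtp   # inspect the rtp atom names for NMA
gmx pdb2gmx -f nma.pdb -ignh                    # or let pdb2gmx strip and rebuild the H atoms
                                                # (only helps if the .hdb entry covers the amide H)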


-Justin

--
==

Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall

Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061

jalem...@vt.edu | (540) 231-3129
http://www.thelemkullab.com

==



[gmx-users] New ACPYPE server

2019-07-19 Thread Alan
Dear community,

Since the original server went down due to hardware obsolescence, we are
happy to announce that the new ACPYPE server is finally up and running
at a new site hosted at http://bio2byte.be/

http://bio2byte.com/acpype/

If you used it in the past, you'll need to register again.

If you don't know what ACPYPE is but are looking for a way to speed up the
process of generating topologies for unusual small molecules like
inhibitors, ligands, or drugs, then check it out.
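
For local use, the command-line tool follows the same idea; a minimal
example (the charge method and net charge are illustrative values only; see
the ACPYPE documentation for the options that fit your molecule):

acpype -i ligand.mol2 -c bcc -n 0   # AM1-BCC charges, net charge 0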

If you run into any issues, please don't hesitate to let us know.

Thanks!

Alan, Luciano Kagami & Wim Vranken
---
Alan Silva, DSc


[gmx-users] Amide hydrogen naming in charmm 36 forcefield

2019-07-19 Thread Dilip.H.N
Hello,
I tried to get the topology for N-methyl acetamide (NMA) through the gmx
pdb2gmx command in the CHARMM36 FF, but I get the following error:

Atom HN in residue NMA 1 was not found in rtp entry NMA with 12 atoms
while sorting atoms.
For a hydrogen, this can be a different protonation state, or it
might have had a different number in the PDB file and was rebuilt
(it might for instance have been H3, and we only expected H1 & H2).
Note that hydrogens might have been added to the entry for the N-terminus.
Remove this hydrogen or choose a different protonation state to solve it.
Option -ignh will ignore all hydrogens in the input.

When I searched, I found the following link
https://mailman-1.sys.kth.se/pipermail/gromacs.org_gmx-users/2014-May/088942.html

which states that "CHARMM has unique nomenclature for amide H
atoms, ..." as noted by Dr. Justin Lemkul. Even in the .hdb file,
the naming has not been addressed.
I have tried renaming the amide hydrogen from 'H' to 'HN', but it still gives
the error. Why is this amide hydrogen not named in the CHARMM36 FF?

So, may I know whether this issue has been resolved?

How can I solve this problem?

Any suggestions are appreciated.
Thank you.
---
With Best Regards,

Dilip.H.N
Ph.D. Student.




[gmx-users] performance issues running gromacs with more than 1 gpu card in slurm

2019-07-19 Thread Carlos Navarro
Dear gmx-users,
I’m currently working on a server where each node has 40 physical cores
(40 threads) and 4 NVIDIA V100s.
When I launch a single job (1 simulation using a single GPU card) I get a
performance of about ~35 ns/day for a system of about 300k atoms. Looking
at the GPU usage during the simulation, I noticed that the card is used at
about ~80%.
The problems arise when I increase the number of jobs running at the same
time. If, for instance, 2 jobs are running at the same time, the performance
drops to ~25 ns/day each, and the usage of the video cards also drops during
the simulation to about 30-40% (sometimes to less than 5%).
Clearly there is a communication problem between the GPU cards and the CPU
during the simulations, but I don’t know how to solve it.
Here is the script I use to run the simulations:

#!/bin/bash -x
#SBATCH --job-name=testAtTPC1
#SBATCH --ntasks-per-node=4
#SBATCH --cpus-per-task=20
#SBATCH --account=hdd22
#SBATCH --nodes=1
#SBATCH --mem=0
#SBATCH --output=sout.%j
#SBATCH --error=s4err.%j
#SBATCH --time=00:10:00
#SBATCH --partition=develgpus
#SBATCH --gres=gpu:4

module use /gpfs/software/juwels/otherstages
module load Stages/2018b
module load Intel/2019.0.117-GCC-7.3.0
module load IntelMPI/2019.0.117
module load GROMACS/2018.3

WORKDIR1=/p/project/chdd22/gromacs/benchmark/AtTPC1/singlegpu/1
WORKDIR2=/p/project/chdd22/gromacs/benchmark/AtTPC1/singlegpu/2
WORKDIR3=/p/project/chdd22/gromacs/benchmark/AtTPC1/singlegpu/3
WORKDIR4=/p/project/chdd22/gromacs/benchmark/AtTPC1/singlegpu/4

DO_PARALLEL=" srun --exclusive -n 1 --gres=gpu:1 "
EXE=" gmx mdrun "

cd $WORKDIR1
$DO_PARALLEL $EXE -s eq6.tpr -deffnm eq6-20 -ntmpi 1 -pin on -pinoffset 0 -ntomp 20 &>log &
cd $WORKDIR2
$DO_PARALLEL $EXE -s eq6.tpr -deffnm eq6-20 -ntmpi 1 -pin on -pinoffset 10 -ntomp 20 &>log &
cd $WORKDIR3
$DO_PARALLEL $EXE -s eq6.tpr -deffnm eq6-20 -ntmpi 1 -pin on -pinoffset 20 -ntomp 20 &>log &
cd $WORKDIR4
$DO_PARALLEL $EXE -s eq6.tpr -deffnm eq6-20 -ntmpi 1 -pin on -pinoffset 30 -ntomp 20 &>log &


Regarding pinoffset, I first tried using 20 cores for each job and then
also tried with 8 cores (so pinoffset 0 for job 1, pinoffset 4 for job 2,
pinoffset 8 for job 3, and pinoffset 12 for job 4), but in the end the
problem persists.
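
One variant that may be worth trying is to bind each job to a distinct GPU
explicitly and give each one a disjoint core range; a sketch under those
assumptions (flag names as in GROMACS 2018; the GPU ids 0-3 assume a 4-GPU
node):

cd $WORKDIR1
$DO_PARALLEL $EXE -s eq6.tpr -deffnm eq6-20 -ntmpi 1 -ntomp 10 \
    -pin on -pinoffset 0 -pinstride 1 -gpu_id 0 &>log &
cd $WORKDIR2
$DO_PARALLEL $EXE -s eq6.tpr -deffnm eq6-20 -ntmpi 1 -ntomp 10 \
    -pin on -pinoffset 10 -pinstride 1 -gpu_id 1 &>log &
# ...and likewise for WORKDIR3/WORKDIR4 with -pinoffset 20/30 and -gpu_id 2/3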

Currently on this machine I’m not able to use more than 1 GPU per job, so
this is my only option for making proper use of the whole node.
If you need more information please just let me know.
Best regards.
Carlos

——
Carlos Navarro Retamal
Bioinformatic Engineering. PhD.
Postdoctoral Researcher in Center of Bioinformatics and Molecular
Simulations
Universidad de Talca
Av. Lircay S/N, Talca, Chile
E: carlos.navarr...@gmail.com or cnava...@utalca.cl

[gmx-users] inexplicable rmsf error - Too many iterations in routine JACOBI

2019-07-19 Thread Mala L Radhakrishnan
Hi all,

I am trying to run gmx rmsf using a particular snapshot as my reference
structure (the -s option).

When I extract a .gro file to use as my reference structure that contains
only the molecule that includes the alpha-carbons in my index file for the
RMSF calculation (I checked this), then midway through reading in the
frames (and not always at the same point) I get the following error:

Program: gmx rmsf, version 2016.3
Source file: src/gromacs/linearalgebra/nrjac.cpp (line 165)

Fatal error:
Error: Too many iterations in routine JACOBI

When I extract a .gro file that contains an even larger subset of the
system's atoms than the previous one, it seg-faults right away.
But when I extract a snapshot that has the *entire* system to use as my
reference structure (all atoms, including solvent and ions), it seems to
work. I am using the same .xtc file and index file in all cases, just
changing the -s file.

Googling the error suggests that I have a mismatch somewhere between my
index file and structure file, but (a) I have checked this by eye and can't
find any mismatches, and (b) when I do the same sort of extraction of a
subset of the system on several other trajectories, it works just fine -- in
other words, this seems to be the only trajectory where I need to extract
the entire system (including solvent, ions, etc.) as my reference frame
just to get an RMSF of the alpha carbons in the protein molecule of interest.
I am using version 2016.3. I have tried multiple GROMACS versions to extract
and run, just in case, but got the same error. The reference structure also
appears to be structurally fine (no broken molecules, etc.).

What could be different about this trajectory that is causing the problem?
Again, I looked by eye at the .gro file I'm using as my reference structure
and made sure that it contains all the atoms in the index-file selection
I'm using for my RMSF calculation...
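
For what it's worth, the whole-system workaround can be scripted so the
reference always matches the trajectory; a minimal sketch (file names and
the dump time are placeholders):

gmx trjconv -s topol.tpr -f traj.xtc -dump 0 -o ref_whole.gro   # write one whole-system frame
gmx rmsf -s ref_whole.gro -f traj.xtc -n index.ndx -o rmsf.xvg  # pick the C-alpha group at the prompt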

Thanks for any help. I am happy to send you the files for which it works
and for which it doesn't...just very confused!

Mala



-- 
Mala L. Radhakrishnan
Associate Professor of Chemistry
Director, Biochemistry Program
Wellesley College
106 Central Street
Wellesley, MA 02481
(781)283-2981


[gmx-users] make manual fails

2019-07-19 Thread Michael Brunsteiner
Thanks Szilard for the reply!
Below is the output of the cmake and make commands. It doesn't find Sphinx,
but then it doesn't seem to look for it either... Anyway, on this second
attempt I get a different error, no idea why. The only difference: the first
time I first built GROMACS and only then the manual; this time I made the
manual directly after the cmake command. See below. It looks as if my Sphinx
version (1.8.4-1) misses something... but then I guess this is a minor issue;
what I try to make here is an exact copy of the online version, isn't it?

mic
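
For context, the usual sequence is roughly the following; the pip step is an
assumption about what CMake is failing to find, and the minimum Sphinx
version may differ for your release:

pip install --user sphinx        # make sphinx-build visible (ensure ~/.local/bin is on PATH)
cmake .. -DGMX_BUILD_MANUAL=ON   # re-run cmake so it detects Sphinx
make manual                      # build the PDF manual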

prompt> cmake .. -DGMX_BUILD_OWN_FFTW=ON -DCMAKE_C_COMPILER=gcc-7 \
    -DCMAKE_CXX_COMPILER=g++-7 -DGMX_GPU=on \
    -DCMAKE_INSTALL_PREFIX=/home/michael/local/gromacs-2019-3-bin \
    -DGMX_BUILD_MANUAL=on
-- The C compiler identification is GNU 7.4.0
-- The CXX compiler identification is GNU 7.4.0
-- Check for working C compiler: /usr/bin/gcc-7
-- Check for working C compiler: /usr/bin/gcc-7 -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/g++-7
-- Check for working CXX compiler: /usr/bin/g++-7 -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Looking for pthread_create
-- Looking for pthread_create - not found
-- Looking for pthread_create in pthreads
-- Looking for pthread_create in pthreads - not found
-- Looking for pthread_create in pthread
-- Looking for pthread_create in pthread - found
-- Found Threads: TRUE  
-- Performing Test CXXFLAG_STD_CXX0X
-- Performing Test CXXFLAG_STD_CXX0X - Success
-- Performing Test CXX11_SUPPORTED
-- Performing Test CXX11_SUPPORTED - Success
-- Performing Test CXX11_STDLIB_PRESENT
-- Performing Test CXX11_STDLIB_PRESENT - Success
-- Looking for NVIDIA GPUs present in the system
-- Number of NVIDIA GPUs detected: 1 
-- Found CUDA: /usr/local/cuda (found suitable version "10.0", minimum required 
is "7.0") 
-- Found OpenMP_C: -fopenmp (found version "4.5") 
-- Found OpenMP_CXX: -fopenmp (found version "4.5") 
-- Found OpenMP: TRUE (found version "4.5")  
-- Performing Test CFLAGS_EXCESS_PREC
-- Performing Test CFLAGS_EXCESS_PREC - Success
-- Performing Test CFLAGS_COPT
-- Performing Test CFLAGS_COPT - Success
-- Performing Test CFLAGS_NOINLINE
-- Performing Test CFLAGS_NOINLINE - Success
-- Performing Test CXXFLAGS_EXCESS_PREC
-- Performing Test CXXFLAGS_EXCESS_PREC - Success
-- Performing Test CXXFLAGS_COPT
-- Performing Test CXXFLAGS_COPT - Success
-- Performing Test CXXFLAGS_NOINLINE
-- Performing Test CXXFLAGS_NOINLINE - Success
-- Looking for include file unistd.h
-- Looking for include file unistd.h - found
-- Looking for include file pwd.h
-- Looking for include file pwd.h - found
-- Looking for include file dirent.h
-- Looking for include file dirent.h - found
-- Looking for include file time.h
-- Looking for include file time.h - found
-- Looking for include file sys/time.h
-- Looking for include file sys/time.h - found
-- Looking for include file io.h
-- Looking for include file io.h - not found
-- Looking for include file sched.h
-- Looking for include file sched.h - found
-- Looking for include file regex.h
-- Looking for include file regex.h - found
-- Looking for gettimeofday
-- Looking for gettimeofday - found
-- Looking for sysconf
-- Looking for sysconf - found
-- Looking for nice
-- Looking for nice - found
-- Looking for fsync
-- Looking for fsync - found
-- Looking for _fileno
-- Looking for _fileno - not found
-- Looking for fileno
-- Looking for fileno - found
-- Looking for _commit
-- Looking for _commit - not found
-- Looking for sigaction
-- Looking for sigaction - found
-- Performing Test HAVE_BUILTIN_CLZ
-- Performing Test HAVE_BUILTIN_CLZ - Success
-- Performing Test HAVE_BUILTIN_CLZLL
-- Performing Test HAVE_BUILTIN_CLZLL - Success
-- Looking for clock_gettime in rt
-- Looking for clock_gettime in rt - found
-- Looking for feenableexcept in m
-- Looking for feenableexcept in m - found
-- Looking for fedisableexcept in m
-- Looking for fedisableexcept in m - found
-- Checking for sched.h GNU affinity API
-- Performing Test sched_affinity_compile
-- Performing Test sched_affinity_compile - Success
-- Looking for include file mm_malloc.h
-- Looking for include file mm_malloc.h - found
-- Looking for include file malloc.h
-- Looking for include file malloc.h - found
-- Looking for include file xmmintrin.h
-- Looking for include file xmmintrin.h - found
-- Checking for _mm_malloc()
-- Checking for _mm_malloc() - supported
-- Looking for posix_memalign
-- Looking for posix_memalign - found
-- Looking for memalign
-- Looking for memalign - not found
-- Check if the system is big endian
-- Searching 16 bit integer
-- Looking for sys/types.h
-- Looking for sys/types.h - found
-- Looking for stdint.h
-- Looking