It looks like configure is not picking up the correct cc, even though it
is on the PATH and easy to find:

06:37 master= /lustre/atlas/proj-shared/geo127/petsc$ cc --version
gcc (GCC) 6.3.0 20161221 (Cray Inc.)
Copyright (C) 2016 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

06:37 master= /lustre/atlas/proj-shared/geo127/petsc$ which cc
/opt/cray/craype/2.5.13/bin/cc
06:38 master= /lustre/atlas/proj-shared/geo127/petsc$ which gcc
/opt/gcc/6.3.0/bin/gcc
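
If it helps, the Cray wrappers can also be handed to configure
explicitly instead of relying on PATH detection. A minimal sketch (only
the compiler options are shown; the rest of the configure line is
whatever you were already using):

  ./configure --with-cc=cc --with-cxx=CC --with-fc=ftn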


On Wed, Oct 31, 2018 at 6:34 AM Mark Adams <mfad...@lbl.gov> wrote:

>
>
> On Wed, Oct 31, 2018 at 5:05 AM Karl Rupp <r...@iue.tuwien.ac.at> wrote:
>
>> Hi Mark,
>>
>> please comment or remove lines 83 and 84 in
>>   config/BuildSystem/config/packages/cuda.py
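>>
>> One way to comment them out in place (a sketch; assumes GNU sed and
>> that the line numbers match your checkout):
>>
>>   sed -i '83,84 s/^/#/' config/BuildSystem/config/packages/cuda.py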
>>
>> Is there a compiler newer than GCC 4.3 available?
>>
>
> You mean 6.3?
>
> 06:33  ~$ module avail gcc
>
> ----------------------------------------------------- /opt/modulefiles -----------------------------------------------------
> gcc/4.8.1          gcc/4.9.3          gcc/6.1.0          gcc/6.3.0(default) gcc/7.2.0
> gcc/4.8.2          gcc/5.3.0          gcc/6.2.0          gcc/7.1.0          gcc/7.3.0
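>
> So gcc/7.3.0 is the newest. If it is worth trying, the swap should be
> something like (a sketch; assuming the cc wrapper picks up the swapped
> module):
>
>   module swap gcc/6.3.0 gcc/7.3.0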
>
>
>
>>
>> Best regards,
>> Karli
>>
>>
>>
>> On 10/31/18 8:15 AM, Mark Adams via petsc-dev wrote:
>> > After loading a cuda module ...
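>> > (presumably something like:
>> >
>> >   module load cudatoolkit
>> >
>> > though the exact module name on this system is a guess)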
>> >
>> > On Wed, Oct 31, 2018 at 2:58 AM Mark Adams <mfad...@lbl.gov> wrote:
>> >
>> >     I get an error with --with-cuda=1
>> >
>> >     On Tue, Oct 30, 2018 at 4:44 PM Smith, Barry F. <bsm...@mcs.anl.gov> wrote:
>> >
>> >         --with-cudac=1 should be --with-cuda=1
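>> >
>> >         i.e. (a sketch; keep the rest of your configure options
>> >         unchanged):
>> >
>> >           ./configure --with-cuda=1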
>> >
>> >
>> >
>> >          > On Oct 30, 2018, at 12:35 PM, Smith, Barry F. via petsc-dev <petsc-dev@mcs.anl.gov> wrote:
>> >          >
>> >          >
>> >          >
>> >          >> On Oct 29, 2018, at 8:09 PM, Mark Adams <mfad...@lbl.gov> wrote:
>> >          >>
>> >          >> And a debug build seems to work:
>> >          >
>> >          >    Well ok.
>> >          >
>> >          >    Are there newer versions of the GNU compiler for this
>> >         system? Are there any other compilers on the system that would
>> >         likely be less buggy? IBM compilers? If this simple code
>> >         generates a gross error with optimization, who's to say how
>> >         many more subtle bugs the buggy optimizer may induce in the
>> >         library (there may be none, but IMHO probability says there
>> >         will be others).
>> >          >
>> >          >    Is there any chance that valgrind runs on this machine?
>> >         You could run the optimized version through it and see what
>> >         it says.
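>> >          >
>> >          >    A sketch of such a run (assuming valgrind is available
>> >         here; jobs on this machine launch through aprun):
>> >          >
>> >          >      aprun -n 1 valgrind --track-origins=yes ./ex19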
>> >          >
>> >          >   Barry
>> >          >
>> >          >>
>> >          >> 21:04 1 master= /lustre/atlas/proj-shared/geo127/petsc$ make PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda PETSC_ARCH="" test
>> >          >> Running test examples to verify correct installation
>> >          >> Using PETSC_DIR=/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda and PETSC_ARCH=
>> >          >> *******************Error detected during compile or link!*******************
>> >          >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> >          >> /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials ex19
>> >          >> *********************************************************************************
>> >          >> cc -o ex19.o -c -g -I/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/include `pwd`/ex19.c
>> >          >> cc -g  -o ex19 ex19.o -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -Wl,-rpath,/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -lpetsc -lHYPRE -lflapack -lfblas -lparmetis -lmetis -ldl
>> >          >> /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(dlimpl.o): In function `PetscDLOpen':
>> >          >> /lustre/atlas1/geo127/proj-shared/petsc/src/sys/dll/dlimpl.c:108: warning: Using 'dlopen' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
>> >          >> /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(send.o): In function `PetscOpenSocket':
>> >          >> /lustre/atlas1/geo127/proj-shared/petsc/src/sys/classes/viewer/impls/socket/send.c:108: warning: Using 'gethostbyname' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
>> >          >> true ex19
>> >          >> rm ex19.o
>> >          >> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 1 MPI process
>> >          >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> >          >> lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
>> >          >> Number of SNES iterations = 2
>> >          >> Application 19081049 resources: utime ~1s, stime ~1s, Rss ~17112, inblocks ~36504, outblocks ~111043
>> >          >> Possible error running C/C++ src/snes/examples/tutorials/ex19 with 2 MPI processes
>> >          >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> >          >> lid velocity = 0.0016, prandtl # = 1., grashof # = 1.
>> >          >> Number of SNES iterations = 2
>> >          >> Application 19081050 resources: utime ~1s, stime ~1s, Rss ~19816, inblocks ~36527, outblocks ~111043
>> >          >> 5a6
>> >          >>> Application 19081051 resources: utime ~1s, stime ~0s, Rss ~13864, inblocks ~36527, outblocks ~111043
>> >          >>
>> >          >> /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials
>> >          >> Possible problem with ex19_hypre, diffs above
>> >          >> =========================================
>> >          >> *******************Error detected during compile or link!*******************
>> >          >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> >          >> /lustre/atlas/proj-shared/geo127/petsc/src/snes/examples/tutorials ex5f
>> >          >> *********************************************************
>> >          >> ftn -c -g -I/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/include -o ex5f.o ex5f.F90
>> >          >> ftn -g   -o ex5f ex5f.o -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -Wl,-rpath,/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -L/lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib -lpetsc -lHYPRE -lflapack -lfblas -lparmetis -lmetis -ldl
>> >          >> /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(dlimpl.o): In function `PetscDLOpen':
>> >          >> /lustre/atlas1/geo127/proj-shared/petsc/src/sys/dll/dlimpl.c:108: warning: Using 'dlopen' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
>> >          >> /lustre/atlas/proj-shared/geo127/petsc_titan_dbg64idx_gnu_cuda/lib/libpetsc.a(send.o): In function `PetscOpenSocket':
>> >          >> /lustre/atlas1/geo127/proj-shared/petsc/src/sys/classes/viewer/impls/socket/send.c:108: warning: Using 'gethostbyname' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
>> >          >> rm ex5f.o
>> >          >> Possible error running Fortran example src/snes/examples/tutorials/ex5f with 1 MPI process
>> >          >> See http://www.mcs.anl.gov/petsc/documentation/faq.html
>> >          >> Number of SNES iterations =     4
>> >          >> Application 19081055 resources: utime ~1s, stime ~0s, Rss ~12760, inblocks ~36800, outblocks ~111983
>> >          >> Completed test examples
>> >          >> 21:06 master= /lustre/atlas/proj-shared/geo127/petsc$
>> >          >
>> >
>>
>