I've commented the autodetect lines out and am waiting for a compute node.

Here is what you wanted.

Thanks,

On Thu, May 14, 2015 at 11:27 AM, Satish Balay <[email protected]> wrote:

> You mention multiple failures here - but include the log for only one case.
>
> Ideally language compilers (c,c++,fortran) should interoperate with
> each other - but when they don't - one has to know the 'compatibility
> libraries' to use.
>
> -lstdc++ is the GNU C++ runtime library. For PGI - the library names
> could be different.
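>
> For illustration - here is a minimal sketch (hypothetical file and
> symbol names, using the Cray compiler wrappers) of why a Fortran-driven
> link needs a C++ runtime library once any C++ source is involved:
>
>   // wrapper.cxx - callable from Fortran, but uses the C++ runtime
>   #include <vector>
>   extern "C" void fill_ones(double *x, int n) {
>     std::vector<double> v(n, 1.0);  // operator new/delete come from the C++ runtime
>     for (int i = 0; i < n; i++) x[i] = v[i];
>   }
>
> Compiling this with 'CC -c wrapper.cxx' and then linking it into a
> Fortran program with ftn can fail with undefined C++ runtime symbols
> unless the right runtime library (-lstdc++ for GNU, the corresponding
> PGI libraries for PGI) is added to the link line.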
>
> --with-cxxlib-autodetect=1 - which is the configure default - tries to
> determine this list - but with cray modules it picks up all the
> libraries that are set up via modules, often listed multiple times -
> sometimes causing grief - hence you are using --with-cxxlib-autodetect=0
> on cray builds.
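>
> On such builds one then lists the needed libraries explicitly - roughly
> along these lines (just a sketch; the option names are PETSc configure
> options, but the exact library list depends on your compiler
> environment):
>
>   ./configure --with-cc=cc --with-cxx=CC --with-fc=ftn \
>     --with-clib-autodetect=0 --with-fortranlib-autodetect=0 \
>     --with-cxxlib-autodetect=0 LIBS=-lstdc++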
>
> Can you send me the following info?
>
> cd src/benchmarks
> cc -v sizeof.c
> CC -v sizeof.c
>
> Also you can try --with-cxxlib-autodetect=1 - and see if it works.
>
> Satish
>
> On Thu, 14 May 2015, Mark Adams wrote:
>
> > We have Titan working, but we have a problem: we need -lstdc++ to get
> > PETSc to configure.  Otherwise it fails to build Fortran.  This seems
> > to only be needed for GNU, but this build is for PGI.  If we remove it,
> > C++ works; otherwise there is a missing external.  I guess I can just
> > deploy two versions, one for Fortran and one for C++ (I have a C++ code
> > and a Fortran code that I support).
> >
> >
> > On Tue, May 12, 2015 at 9:36 AM, Barry Smith <[email protected]> wrote:
> >
> > >
> > >   Remind me again how much the US taxpayers spend on this machine
> > > each year.
> > >
> > >
> > > Begin forwarded message:
> > >
> > > From: Mark Adams <[email protected]>
> > > Subject: Re: [petsc-users] configure error on Titan with Intel
> > > Date: May 12, 2015 at 11:12:55 AM CDT
> > > To: Barry Smith <[email protected]>
> > > Cc: Jed Brown <[email protected]>, petsc-users <[email protected]>,
> > > "David Trebotich" <[email protected]>, "D'Azevedo, Ed F." <[email protected]>
> > >
> > >
> > >
> > >>   Notes: ./configure ran fine and detected -lhwloc in some standard
> > >> system install location; under normal circumstances it couldn't just
> > >> disappear for a different example.
> > >>
> > >>
> > > I configured in an interactive shell, so on a compute node.  I tried
> > > to 'make ex56' on a login node, as usual.  So I am guessing that it
> > > would have worked if I had run make on a compute node.  They have an
> > > inconsistency, I'm guessing. I can try it all on a compute node ....
> > >
> > > Mark
> > >
> > >
> > >>
> > >> >
> > >> > On Fri, May 8, 2015 at 5:51 PM, Jed Brown <[email protected]> wrote:
> > >> > Satish Balay <[email protected]> writes:
> > >> >
> > >> > > Also - Perhaps with intel compilers - the recommended blas is
> > >> > > something other than ACML? [like MKL?]. Something to check..
> > >> >
> > >> > MKL (and anything built using the Intel compiler) has the "run
> > >> > slower on AMD" feature.  For any given operation, ACML won't
> > >> > necessarily be faster, but at least it's not intentionally
> > >> > crippled.  For most of what PETSc does, any such MKL/ACML
> > >> > difference is irrelevant.
> > >> >
> > >> > <configure.log><make.log>
> > >>
> > >>
> > >
> > >
> >
>
>

Attachment: out