On Thu, Mar 22, 2018 at 8:29 PM, 我 wrote:
Hi all,
I want to analyze the preconditioned matrix, but KSPComputeExplicitOperator
takes too much time to obtain the matrix. My original matrix is a sparse one
of about 3000*3000. I noticed this function is applicable only to relatively
small systems. What's the matrix-size limitation for this function?
Thanks, Jose,
Works fine.
Thanks,
Fande,
On Thu, Mar 22, 2018 at 2:43 AM, Jose E. Roman wrote:
> Fixed
> https://bitbucket.org/slepc/slepc/commits/464bcc967aa18470486aba71868e0ae158c3fe49
On Mar 22, 2018, at 10:22 AM, Klaij, Christiaan wrote:
Well, that's a small burden.
By the way, the option --with-cc=mpicc together with export
I_MPI_CC=icc doesn't work with --download-hdf5, while
--with-cc=mpiicc with no export does. Guess the export doesn't
make it into the hdf5 configure.
Any plans for adding --download-cgns and have
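A sketch of the two invocations compared above (the package choice and the environment variable are the ones reported in this thread; whether the export reaches the nested hdf5 configure is exactly what is in question here):

```shell
# Reported NOT to work: the exported wrapper override apparently
# doesn't reach the hdf5 configure run by --download-hdf5
export I_MPI_CC=icc
./configure --with-cc=mpicc --download-hdf5

# Reported to work: name the Intel-specific wrapper directly,
# with no environment override needed
./configure --with-cc=mpiicc --download-hdf5
```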
Should mention:
one nice thing about the --with-mpi-dir option is that it attempts to pick up
related/compatible compilers - i.e. MPI_DIR/bin/{mpicc,mpicxx,mpif90}.
When one uses --with-cc=mpicc --with-fc=mpif90 --with-cxx=mpicxx, the
burden of providing related/compatible compilers is on the user.
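The two styles side by side (a sketch; the Intel install prefix is the one from this thread, so adjust to your own layout):

```shell
# Style 1: give configure the MPI root; it then looks for
# MPI_DIR/bin/{mpicc,mpicxx,mpif90} itself, so the three wrappers
# are guaranteed to come from the same MPI install
./configure --with-mpi-dir=/opt/intel/compilers_and_libraries_2017.1.132/linux/mpi

# Style 2: name each wrapper yourself; configure won't check that
# the three wrappers actually belong together
./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90
```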
Guess I shouldn't have read the intel manual :-)
Lesson learned, no more guessing. And I sure appreciate all the
effort you are putting into this, especially the option to
configure and build other packages in a consistent way with the
--download-package option.
Chris
dr. ir. Christiaan Klaij
Sure - they have so many ways of tweaking things - so one can't assume things
will be the same across multiple users.
Most users don't set these env variables [that's extra work - and not
the default behavior anyway] - so they rely on the default behavior -
i.e. use mpicc for GNU and mpiicc for Intel.
Fair enough.
As a side note: if you want the Intel compilers, there's no need to
specify mpiicc; Intel says to export I_MPI_CC=icc, which gives
$ which mpicc
/opt/intel/compilers_and_libraries_2017.1.132/linux/mpi/intel64/bin/mpicc
$ mpicc -v
mpiicc for the Intel(R) MPI Library 2017 Update 1 for
On Thu, 22 Mar 2018, Klaij, Christiaan wrote:
There are two types of options to configure:
- tell configure what to use - i.e. do not guess [like
There's a bin64 symlink:
$ ls -l /opt/intel/compilers_and_libraries_2017.1.132/linux/mpi
total 0
drwxr-xr-x. 3 root root 16 Mar 28 2017 benchmarks
lrwxrwxrwx. 1 root root 11 Mar 28 2017 bin64 -> intel64/bin
drwxr-xr-x. 2 root root 41 Mar 28 2017 binding
lrwxrwxrwx. 1 root root 11 Mar 28 2017
OK, configure works with the option --with-cc=/path/to/mpicc,
thanks!
I'm not sure you are right about checking --with-mpi-dir/bin,
because, at least with this install, the binaries are located in
--with-mpi-dir/intel64/bin:
$ which mpicc
Added this change to balay/mpi-dir-check-warning/maint and merged to next.
Satish
On Thu, 22 Mar 2018, Satish Balay wrote:
The relevant change that is causing the difference is:
https://bitbucket.org/petsc/petsc/commits/a98758b74fbc47f3dca87b526141d347301fd5eb
Perhaps we should print a warning if with-mpi-dir/bin is missing.
$ ./configure --with-mpi-dir=$HOME/tmp
On Thu, Mar 22, 2018 at 8:42 AM, Klaij, Christiaan wrote:
Yes, I believe we took out the compiler finding stuff since the
In either case,
--with-mpi-dir=/opt/intel/compilers_and_libraries_2017.1.132/linux/mpi is the
wrong option. It's a shortcut for
--with-cc=/opt/intel/compilers_and_libraries_2017.1.132/linux/mpi/bin/mpicc -
which you don't have.
Since you have mpicc in your path - you can just use:
--with-cc=mpicc
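In other words (a sketch using the paths from this thread; on this Intel layout the wrappers sit under MPI_DIR/intel64/bin, with only a bin64 symlink at the top level):

```shell
# Wrong for this layout: configure expects MPI_DIR/bin/mpicc,
# but here the wrappers live under MPI_DIR/intel64/bin
./configure --with-mpi-dir=/opt/intel/compilers_and_libraries_2017.1.132/linux/mpi

# Works, since mpicc is already on PATH
./configure --with-cc=mpicc
```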
Matt,
The problem must be earlier, because it should be using mpicc for
the check, not gcc. The mpi.h is found here by the 3.7.5 config:
Executing: mpicc -E -I/tmp/petsc-JIF0WC/config.setCompilers
-I/tmp/petsc-JIF0WC/config.types -I/tmp/petsc-JIF0WC/config.headers
On Thu, Mar 22, 2018 at 8:00 AM, Klaij, Christiaan wrote:
> Satish,
>
> I'm trying to upgrade from 3.7.5 to 3.8.3. The first problem is
> that my intel mpi installation, which works for 3.7.5, fails with
> 3.8.3, see the attached logs. It seems that the mpi compilers are
> not
Fixed
https://bitbucket.org/slepc/slepc/commits/464bcc967aa18470486aba71868e0ae158c3fe49
> On 22 Mar 2018, at 3:23, Satish Balay wrote:
>
> The primary change is - DESTDIR in petscvariables is replaced with PREFIXDIR
>
> i.e:
>
> diff --git