[petsc-users] Quick Question on /src/ksp/ksp/example/ex29.c

2011-09-13 Thread Jed Brown
On Tue, Sep 13, 2011 at 05:59, Alan Wei zhenglun.wei at gmail.com wrote:

 I'm still working on this Poisson solver, and found a small problem.
 I found from the original code that DMMGGetx(dmmg) can get the
 solution vector after the calculation. However, I want to have it as a
 2-dimensional array (since I solve a 2D Poisson equation), like
 coors[j][i]. How can I do that?


http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/DM/DMDAVecGetArray.html

See also the section of the users manual regarding structured grids.

If you upgrade to petsc-3.2, you can also do multigrid without DMMG. See
src/ksp/ksp/examples/tutorials/ex45.c for an example. This is now the
preferred interface: we would like to eventually remove DMMG because it is
more complicated to use and does not compose well with other solvers, e.g. TS.
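To make the suggestion above concrete, here is a minimal sketch of the DMDAVecGetArray pattern in C. It assumes a 2D DMDA `da` and a compatible global Vec `x` already exist; the function name `DumpSolution` is illustrative, not part of PETSc:

```c
#include <petscdmda.h>

/* Sketch: view a DMDA global vector as a 2D array indexed u[j][i],
   using the (global) indices of the locally owned corner region. */
PetscErrorCode DumpSolution(DM da, Vec x)
{
  PetscScalar  **u;
  PetscInt       i, j, xs, ys, xm, ym;
  PetscErrorCode ierr;

  ierr = DMDAVecGetArray(da, x, &u);CHKERRQ(ierr);
  ierr = DMDAGetCorners(da, &xs, &ys, NULL, &xm, &ym, NULL);CHKERRQ(ierr);
  for (j = ys; j < ys + ym; j++) {
    for (i = xs; i < xs + xm; i++) {
      ierr = PetscPrintf(PETSC_COMM_SELF, "u[%d][%d] = %g\n",
                         (int)j, (int)i,
                         (double)PetscRealPart(u[j][i]));CHKERRQ(ierr);
    }
  }
  ierr = DMDAVecRestoreArray(da, x, &u);CHKERRQ(ierr);
  return 0;
}
```

Note that each process only sees (and should only index) its own owned region, which is why the loop bounds come from DMDAGetCorners rather than the global grid size.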


[petsc-users] [petsc-maint #86567] Question on KSPs

2011-09-13 Thread Michele De Stefano


[petsc-users] Neumann BC for interface elliptic problem

2011-09-13 Thread Marco Cisternino
Hi everybody,
I'm trying to solve the equation
nabla(k*nabla(u))=f,
with homogeneous Neumann boundary conditions, where k is piecewise
constant and f is a zero-mean source (compatibility is respected).
To do this I solve a linear system extended by transmission conditions,
obtained with a second-order FD scheme.
Everything works fine with Dirichlet BCs, and with Neumann BCs too, if the
interface where k jumps cuts the computational domain.
But if the interface is in the middle of the computational domain,
something goes wrong where the processes overlap, with an overall
loss of symmetry in the solution, even though the source, the interface
and the BCs are symmetric.
I use GMRES with ASM. These are the lines that create the null space:

   has_cnst = PETSC_TRUE
   call MatNullSpaceCreate(MPI_CART_COMM,has_cnst,0,PETSC_NULL_OBJECT,nsppoi,ierr)
   call KSPSetNullSpace(ksppoi,nsppoi,ierr)

Ask me everything you need to better understand the problem.
Could you help me?
Thanks.

 Marco

-- 
Marco Cisternino
PhD Student
Politecnico di Torino
Email:marco.cisternino at polito.it



[petsc-users] [petsc-maint #86567] Question on KSPs

2011-09-13 Thread Matthew Knepley
On Tue, Sep 13, 2011 at 3:03 AM, Michele De Stefano 
mstefano at milan.westerngeco.slb.com wrote:

 Jed and others,

 we would like to explore in more detail the possibility of having a
 completely matrix-free preconditioner.
 Could you explain which are the few methods that we should implement,
 the ones you were talking about?

 Second question: are there algorithms for creating matrix-free
 preconditioners?
 If so, could you suggest at least one, please?


There is really no conceptual difference between Krylov methods and
preconditioners in that they are both approximate linear solvers. The
distinction was made mostly because Krylov solvers only need the action
of the matrix (or its transpose) and thus are matrix-free. Therefore,
"matrix-free preconditioner" basically means a Krylov method (where I
include Chebyshev, which is a KSP in PETSc).

So what many scalable solvers do is combine KSPs in a nice way, like a
multigrid iteration.

There may be things you can do for your particular equations, like analytic
simplifications, but these
would not be in PETSc since they are not generic.
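One standard way to compose KSPs as described above is PCKSP, which wraps one Krylov method as the preconditioner of another. A hedged sketch (petsc-3.2-era API, where the Chebyshev type macro is spelled KSPCHEBYCHEV; the function name `UseKrylovAsPC` is illustrative):

```c
#include <petscksp.h>

/* Sketch: use an inner Krylov method (Chebyshev) as the preconditioner
   of an outer KSP, i.e. a fully matrix-free "preconditioner". */
PetscErrorCode UseKrylovAsPC(KSP outer)
{
  PC             pc;
  KSP            inner;
  PetscErrorCode ierr;

  ierr = KSPGetPC(outer, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCKSP);CHKERRQ(ierr);       /* preconditioner is itself a KSP */
  ierr = PCKSPGetKSP(pc, &inner);CHKERRQ(ierr);
  ierr = KSPSetType(inner, KSPCHEBYCHEV);CHKERRQ(ierr);
  /* a few fixed inner iterations; tolerances left at defaults */
  ierr = KSPSetTolerances(inner, PETSC_DEFAULT, PETSC_DEFAULT,
                          PETSC_DEFAULT, 10);CHKERRQ(ierr);
  return 0;
}
```

Because the preconditioner is then a nonlinear operation (it varies between applications), the outer solver would typically be a flexible method such as FGMRES.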

  Thanks,

 Matt


 Thank you in advance.
 Best regards,

 Michele

 Jed Brown wrote:

 On Mon, Sep 12, 2011 at 16:28, Michele De Stefano 
 mstefano at milan.westerngeco.slb.com wrote:

 KSPSetOperators is able to accept a shell matrix for the Amat, and the
 manual says that the Pmat is usually the same as the Amat. Does this
 really work also when Amat is a shell matrix?

 I mean, in the multi-process case, the default preconditioning method is
 block Jacobi ... but does this work also when Pmat is a shell matrix?


 Only if a few methods are implemented for the shell matrix. This is usually
 not practical. It would be more common to assemble some approximation Pmat
 (e.g. in AIJ format) and use that for preconditioning. To do everything
 matrix-free, you would normally have to put some effort into writing your
 own custom preconditioner.
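For concreteness, the "few methods" a shell matrix needs can be supplied with MatShellSetOperation. A sketch (the callbacks `MyMult` and `MyGetDiagonal` are hypothetical user functions, not PETSc API):

```c
#include <petscksp.h>

/* User-provided callbacks (hypothetical): apply y = A*x, and extract
   the diagonal. MatMult alone suffices for an unpreconditioned Krylov
   method; point Jacobi additionally needs MATOP_GET_DIAGONAL. */
extern PetscErrorCode MyMult(Mat A, Vec x, Vec y);
extern PetscErrorCode MyGetDiagonal(Mat A, Vec d);

PetscErrorCode CreateShellOperator(MPI_Comm comm, PetscInt n, void *ctx, Mat *A)
{
  PetscErrorCode ierr;

  /* n-by-n global operator, PETSc decides the row distribution */
  ierr = MatCreateShell(comm, PETSC_DECIDE, PETSC_DECIDE, n, n, ctx, A);CHKERRQ(ierr);
  ierr = MatShellSetOperation(*A, MATOP_MULT,
                              (void(*)(void))MyMult);CHKERRQ(ierr);
  ierr = MatShellSetOperation(*A, MATOP_GET_DIAGONAL,
                              (void(*)(void))MyGetDiagonal);CHKERRQ(ierr);
  return 0;
}
```

Preconditioners like block Jacobi or ASM need actual submatrices, which a shell matrix cannot provide; this is why assembling an approximate Pmat is usually the practical route.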


 --
 Michele De Stefano
 Senior Geophysicist - REMS
 Integrated EM Center of Excellence

 WesternGeco GeoSolutions,
 Via Clericetti 42/A,
 20133, Milan - Italy

 +39 02 266279232 (direct)
 +39 02 266279279 (fax)

 mstefano at slb.com

 Schlumberger Private




-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener


[petsc-users] Neumann BC for interface elliptic problem

2011-09-13 Thread Matthew Knepley
On Tue, Sep 13, 2011 at 5:56 AM, Marco Cisternino 
marco.cisternino at polito.it wrote:

 Hi everybody,
 I'm trying to solve the equation
 nabla(k*nabla(u))=f,
 with homogeneous Neumann boundary conditions, where k is piecewise
 constant and f is a zero-mean source (compatibility is respected).
 To do this I solve a linear system extended by transmission conditions,
 obtained with a second-order FD scheme.
 Everything works fine with Dirichlet BCs, and with Neumann BCs too, if the
 interface where k jumps cuts the computational domain.
 But if the interface is in the middle of the computational domain,
 something goes wrong where the processes overlap, with an overall loss of
 symmetry in the solution, even though the source, the interface and the
 BCs are symmetric.
 I use GMRES with ASM. These are the lines that create the null space:

  has_cnst = PETSC_TRUE
  call MatNullSpaceCreate(MPI_CART_COMM,has_cnst,0,PETSC_NULL_OBJECT,nsppoi,ierr)
  call KSPSetNullSpace(ksppoi,nsppoi,ierr)

 Ask me everything you need to better understand the problem.
 Could you help me?


If you have matrices, or submatrices, which must be symmetric, you can check
this using


http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/Mat/MatIsSymmetric.html
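A minimal sketch of that check (the tolerance value and function name `CheckSymmetry` are illustrative; MatIsSymmetric is implemented for assembled formats like AIJ):

```c
#include <petscmat.h>

/* Sketch: test whether an assembled matrix A is symmetric.
   tol = 0.0 demands exact symmetry; a small tol tolerates round-off. */
PetscErrorCode CheckSymmetry(Mat A)
{
  PetscBool      symm; /* PetscBool in petsc-3.2; PetscTruth in older versions */
  PetscErrorCode ierr;

  ierr = MatIsSymmetric(A, 1.0e-12, &symm);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD,
                     symm ? "Matrix is symmetric\n"
                          : "Matrix is NOT symmetric\n");CHKERRQ(ierr);
  return 0;
}
```

Running this per subdomain matrix (e.g. the ones ASM extracts) can localize where symmetry is first lost.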

  Thanks,

 Matt


 Thanks.

Marco

 --
 Marco Cisternino
 PhD Student
 Politecnico di Torino
 Email: marco.cisternino at polito.it




-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener


[petsc-users] Building PETSc with cmake

2011-09-13 Thread John Fettig
What are the requirements for building PETSc with cmake?  I have cmake
version 2.8.1 installed, and configure finds it, but it does nothing
with it.  Is it because my python version is too old?  This is on
CentOS 5, I get these messages about cmake in the configure.log (with
3.2-p0):

Checking for program /usr/local/bin/cmake...found
Defined make macro CMAKE to /usr/local/bin/cmake

...

  Skipping cmakegen due to old python version: (2, 4, 3, 'final', 0)
  Skipping cmakeboot due to old python version: (2, 4, 3, 'final', 0)

At the end of the configure, I'm not offered a way to build with
cmake.  I presume that cmake is the only way to do a parallel build?

John


[petsc-users] Building PETSc with cmake

2011-09-13 Thread Matthew Knepley
On Tue, Sep 13, 2011 at 8:55 AM, John Fettig john.fettig at gmail.com wrote:

 What are the requirements for building PETSc with cmake?  I have cmake
 version 2.8.1 installed, and configure finds it, but it does nothing
 with it.  Is it because my python version is too old?  This is on
 CentOS 5, I get these messages about cmake in the configure.log (with
 3.2-p0):

 Checking for program /usr/local/bin/cmake...found
Defined make macro CMAKE to /usr/local/bin/cmake

 ...

  Skipping cmakegen due to old python version: (2, 4, 3, 'final', 0)
  Skipping cmakeboot due to old python version: (2, 4, 3, 'final', 0)

 At the end of the configure, I'm not offered a way to build with
 cmake.  I presume that cmake is the only way to do a parallel build?


Yes, and it is because your Python version is too old. I believe it needs
2.6. I would
install 2.7 if you are upgrading.

   Matt



 John

-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments
lead.
-- Norbert Wiener


[petsc-users] Building PETSc with cmake

2011-09-13 Thread Satish Balay
On Tue, 13 Sep 2011, John Fettig wrote:

 What are the requirements for building PETSc with cmake?  I have cmake
 version 2.8.1 installed, and configure finds it, but it does nothing
 with it.  Is it because my python version is too old?  This is on
 CentOS 5, I get these messages about cmake in the configure.log (with
 3.2-p0):
 
 Checking for program /usr/local/bin/cmake...found
 Defined make macro CMAKE to /usr/local/bin/cmake
 
 ...
 
   Skipping cmakegen due to old python version: (2, 4, 3, 'final', 0)
   Skipping cmakeboot due to old python version: (2, 4, 3, 'final', 0)

yes - the cmake part of configure requires python-2.5. Perhaps we should
look at this code and see if it can be made to work with 2.3+ [the current
configure requirement].

If configure determines that the cmake build won't work [either due to the
python version issue - or some other configure issue - e.g. windows
compilers] it won't offer this build option at the end.

 At the end of the configure, I'm not offered a way to build with
 cmake.  I presume that cmake is the only way to do a parallel build?

Yes - currently the cmake build is the only way to do parallel builds.

Also, builder.py [if it works for your config] could be faster than
the 'make all-legacy' build.

Satish

Satish


[petsc-users] Building PETSc with cmake

2011-09-13 Thread John Fettig
On Tue, Sep 13, 2011 at 9:59 AM, Matthew Knepley knepley at gmail.com wrote:
 On Tue, Sep 13, 2011 at 8:55 AM, John Fettig john.fettig at gmail.com wrote:

 What are the requirements for building PETSc with cmake?  I have cmake
 version 2.8.1 installed, and configure finds it, but it does nothing
 with it.  Is it because my python version is too old?  This is on
 CentOS 5, I get these messages about cmake in the configure.log (with
 3.2-p0):

 Checking for program /usr/local/bin/cmake...found
            Defined make macro CMAKE to /usr/local/bin/cmake

 ...

      Skipping cmakegen due to old python version: (2, 4, 3, 'final', 0)
      Skipping cmakeboot due to old python version: (2, 4, 3, 'final', 0)

 At the end of the configure, I'm not offered a way to build with
 cmake.  I presume that cmake is the only way to do a parallel build?

 Yes, and it is because your Python version is too old. I believe it needs
 2.6. I would
 install 2.7 if you are upgrading.

Is there any way to tell configure which python to use?  I have python
2.6 installed, it's just not what runs when you type `python`.  I
don't think I can change the default python in CentOS 5 without
breaking a lot of other things.

John


[petsc-users] Building PETSc with cmake

2011-09-13 Thread Satish Balay
On Tue, 13 Sep 2011, John Fettig wrote:

 On Tue, Sep 13, 2011 at 9:59 AM, Matthew Knepley knepley at gmail.com wrote:
  On Tue, Sep 13, 2011 at 8:55 AM, John Fettig john.fettig at gmail.com 
  wrote:
 
  What are the requirements for building PETSc with cmake?  I have cmake
  version 2.8.1 installed, and configure finds it, but it does nothing
  with it.  Is it because my python version is too old?  This is on
  CentOS 5, I get these messages about cmake in the configure.log (with
  3.2-p0):

  Checking for program /usr/local/bin/cmake...found
             Defined make macro CMAKE to /usr/local/bin/cmake

  ...

       Skipping cmakegen due to old python version: (2, 4, 3, 'final', 0)
       Skipping cmakeboot due to old python version: (2, 4, 3, 'final', 0)

  At the end of the configure, I'm not offered a way to build with
  cmake.  I presume that cmake is the only way to do a parallel build?
 
  Yes, and it is because your Python version is too old. I believe it needs
  2.6. I would
  install 2.7 if you are upgrading.
 
 Is there any way to tell configure which python to use?  I have python
 2.6 installed, it's just not what runs when you type `python`.  I
 don't think I can change the default python in CentOS 5 without
 breaking a lot of other things.

/path/to/python2.6 configure  [configure_options]

Satish


[petsc-users] Quick Question on /src/ksp/ksp/example/ex29.c

2011-09-13 Thread Alan Wei
Thanks, Jed. Let me finish this part first and I will upgrade to version 3.2
to try new things.

best,
Alan


On Tue, Sep 13, 2011 at 1:10 AM, Jed Brown jedbrown at mcs.anl.gov wrote:

 On Tue, Sep 13, 2011 at 05:59, Alan Wei zhenglun.wei at gmail.com wrote:

 I'm still working on this Poisson solver, and found a small problem.
 I found from the original code that DMMGGetx(dmmg) can get the
 solution vector after the calculation. However, I want to have it as a
 2-dimensional array (since I solve a 2D Poisson equation), like
 coors[j][i]. How can I do that?



 http://www.mcs.anl.gov/petsc/petsc-as/snapshots/petsc-current/docs/manualpages/DM/DMDAVecGetArray.html

 See also the section of the users manual regarding structured grids.

 If you upgrade to petsc-3.2, you can also do multigrid without DMMG. See
 src/ksp/ksp/examples/tutorials/ex45.c for an example. This is now the
 preferred interface: we would like to eventually remove DMMG because it is
 more complicated to use and does not compose well with other solvers, e.g. TS.



[petsc-users] Building PETSc with cmake

2011-09-13 Thread John Fettig
On Tue, Sep 13, 2011 at 10:09 AM, Satish Balay balay at mcs.anl.gov wrote:
 On Tue, 13 Sep 2011, John Fettig wrote:
 Is there any way to tell configure which python to use?  I have python
 2.6 installed, it's just not what runs when you type `python`.  I
 don't think I can change the default python in CentOS 5 without
 breaking a lot of other things.

 /path/to/python2.6 configure [configure_options]


This gets configure to try cmake, but it still doesn't use it.  It
doesn't set PETSC_BUILD_USING_CMAKE to 1 as I see the nightly
builds doing, and again doesn't offer the cmake build at the end
of configure.  Is this only available in petsc-dev?

Thanks,
John

Invoking: ['/usr/local/bin/cmake',
'/home/jfe/local/centos/petsc-3.2-p0',
'-DCMAKE_C_COMPILER:FILEPATH=icc', '-DCMAKE_C_FLAGS:STRING= -fPIC
-wd1572 -Qoption,cpp,--extended_float_type -g -debug
inline_debug_info', '-DCMAKE_Fortran_COMPILER:FILEPATH=ifort',
'-DCMAKE_Fortran_FLAGS:STRING= -fPIC -g -debug inline_debug_info']
sh: ['/usr/local/bin/cmake', '/home/jfe/local/centos/petsc-3.2-p0',
'-DCMAKE_C_COMPILER:FILEPATH=icc', '-DCMAKE_C_FLAGS:STRING= -fPIC
-wd1572 -Qoption,cpp,--extended_float_type -g -debug
inline_debug_info', '-DCMAKE_Fortran_COMPILER:FILEPATH=ifort',
'-DCMAKE_Fortran_FLAGS:STRING= -fPIC -g -debug inline_debug_info']
Executing: ['/usr/local/bin/cmake',
'/home/jfe/local/centos/petsc-3.2-p0',
'-DCMAKE_C_COMPILER:FILEPATH=icc', '-DCMAKE_C_FLAGS:STRING= -fPIC
-wd1572 -Qoption,cpp,--extended_float_type -g -debug
inline_debug_info', '-DCMAKE_Fortran_COMPILER:FILEPATH=ifort',
'-DCMAKE_Fortran_FLAGS:STRING= -fPIC -g -debug inline_debug_info']
sh: -- The C compiler identification is Intel
-- Check for working C compiler: /opt/intel/Compiler/11.1/072/bin/intel64/icc
-- Check for working C compiler:
/opt/intel/Compiler/11.1/072/bin/intel64/icc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- The Fortran compiler identification is Intel
-- Check for working Fortran compiler:
/opt/intel/Compiler/11.1/072/bin/intel64/ifort
-- Check for working Fortran compiler:
/opt/intel/Compiler/11.1/072/bin/intel64/ifort  -- works
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Checking whether /opt/intel/Compiler/11.1/072/bin/intel64/ifort
supports Fortran 90
-- Checking whether /opt/intel/Compiler/11.1/072/bin/intel64/ifort
supports Fortran 90 -- yes
-- Configuring done
-- Generating done
-- Build files have been written to:
/home/jfe/local/centos/petsc-3.2-p0/intel-debug





[petsc-users] Building PETSc with cmake

2011-09-13 Thread Satish Balay
It should work in petsc-3.2. Can you send configure.log to petsc-maint?

Satish

On Tue, 13 Sep 2011, John Fettig wrote:

 On Tue, Sep 13, 2011 at 10:09 AM, Satish Balay balay at mcs.anl.gov wrote:
  On Tue, 13 Sep 2011, John Fettig wrote:
  Is there any way to tell configure which python to use?  I have python
  2.6 installed, it's just not what runs when you type `python`.  I
  don't think I can change the default python in CentOS 5 without
  breaking a lot of other things.
 
  /path/to/python2.6 configure [configure_options]
 
 
 This gets configure to try cmake, but it still doesn't use it.  It
 doesn't set PETSC_BUILD_USING_CMAKE to 1 as I see the nightly
 builds doing, and again doesn't offer for me to use cmake at the end
 of configure.  Is this only available in petsc-dev?
 
 Thanks,
 John
 
 Invoking: ['/usr/local/bin/cmake',
 '/home/jfe/local/centos/petsc-3.2-p0',
 '-DCMAKE_C_COMPILER:FILEPATH=icc', '-DCMAKE_C_FLAGS:STRING= -fPIC
 -wd1572 -Qoption,cpp,--extended_float_type -g -debug
 inline_debug_info', '-DCMAKE_Fortran_COMPILER:FILEPATH=ifort',
 '-DCMAKE_Fortran_FLAGS:STRING= -fPIC -g -debug inline_debug_info']
 sh: ['/usr/local/bin/cmake', '/home/jfe/local/centos/petsc-3.2-p0',
 '-DCMAKE_C_COMPILER:FILEPATH=icc', '-DCMAKE_C_FLAGS:STRING= -fPIC
 -wd1572 -Qoption,cpp,--extended_float_type -g -debug
 inline_debug_info', '-DCMAKE_Fortran_COMPILER:FILEPATH=ifort',
 '-DCMAKE_Fortran_FLAGS:STRING= -fPIC -g -debug inline_debug_info']
 Executing: ['/usr/local/bin/cmake',
 '/home/jfe/local/centos/petsc-3.2-p0',
 '-DCMAKE_C_COMPILER:FILEPATH=icc', '-DCMAKE_C_FLAGS:STRING= -fPIC
 -wd1572 -Qoption,cpp,--extended_float_type -g -debug
 inline_debug_info', '-DCMAKE_Fortran_COMPILER:FILEPATH=ifort',
 '-DCMAKE_Fortran_FLAGS:STRING= -fPIC -g -debug inline_debug_info']
 sh: -- The C compiler identification is Intel
 -- Check for working C compiler: /opt/intel/Compiler/11.1/072/bin/intel64/icc
 -- Check for working C compiler:
 /opt/intel/Compiler/11.1/072/bin/intel64/icc -- works
 -- Detecting C compiler ABI info
 -- Detecting C compiler ABI info - done
 -- The Fortran compiler identification is Intel
 -- Check for working Fortran compiler:
 /opt/intel/Compiler/11.1/072/bin/intel64/ifort
 -- Check for working Fortran compiler:
 /opt/intel/Compiler/11.1/072/bin/intel64/ifort  -- works
 -- Detecting Fortran compiler ABI info
 -- Detecting Fortran compiler ABI info - done
 -- Checking whether /opt/intel/Compiler/11.1/072/bin/intel64/ifort
 supports Fortran 90
 -- Checking whether /opt/intel/Compiler/11.1/072/bin/intel64/ifort
 supports Fortran 90 -- yes
 -- Configuring done
 -- Generating done
 -- Build files have been written to:
 /home/jfe/local/centos/petsc-3.2-p0/intel-debug
 
 
 
 


[petsc-users] Neumann BC for interface elliptic problem

2011-09-13 Thread Marco Cisternino
Thanks, Matthew, for the quick reply.
My matrices are not symmetric. The differential problem is symmetric,
but the matrices are not, and not only because of the transmission
conditions at the interface: I also explicitly put the discretization of
the forward or backward second-order Neumann boundary conditions as rows
in my matrix. So even if I solve a problem without interfaces, my matrix
is not symmetric, and even in that case I get problems along the process
boundaries and a loss of symmetry in the solution.
Yet the matrix structure is exactly the same when I solve the problem
with the interface cutting the domain (always with the same
implementation of the Neumann boundary conditions), with no problem.
I can eliminate the BC rows by nesting them in the discretization of the
Laplacian, but the interface will always give me an asymmetric matrix.
Most of all, I can't understand why everything works for one kind of
interface and not for the other.
Thanks again.

 Marco



[petsc-users] Neumann BC for interface elliptic problem

2011-09-13 Thread Barry Smith

On Sep 13, 2011, at 10:58 AM, Marco Cisternino wrote:

 Thanks, Matthew, for the quick reply.
 My matrices are not symmetric. The differential problem is symmetric, but
 the matrices are not.
 And not only because of the transmission conditions at the interface, but
 also because I explicitly put the discretization of the forward or backward
 second-order Neumann boundary conditions as rows in my matrix. So even if I
 solve a problem without interfaces, my matrix is not symmetric. And even in
 that case I get problems along the process boundaries and a loss of
 symmetry in the solution.
 But the matrix structure is exactly the same when I solve the problem with
 the interface cutting the domain (always with the same implementation of
 the Neumann boundary conditions), with no problem.
 I can eliminate the BC rows by nesting them in the discretization of the
 Laplacian, but the interface will always give me an asymmetric matrix.
 But most of all I can't understand why for one kind of interface everything
 works

   What is everything that works? Do you mean the iterative solver converges 
in one case but not the other? Do you mean it always converges but the answer 
is wrong in one case but not the other?

Barry

 and for the other not.
 Thanks again.
 
Marco
 



[petsc-users] Non-official C++ binding for PETSc

2011-09-13 Thread Jack Poulson
Hi Caner,

Is the templating more than aesthetic? It would seem to me from, for
instance,
https://github.com/canercandan/petsc-cxx/blob/master/src/petsc_cxx/Vector.h
that only a single datatype is supported for your Vec and Matrix wrappers
(PetscScalar).

So, for example, in this test
https://github.com/canercandan/petsc-cxx/blob/master/test/t-create_vector_cxx.cpp
could the Vector's x and y instead be built on top of int rather than
Scalar? If not, why bother with the template parameter? It would seem that
the main advantage of these wrappers is the syntactic sugar for printing.

Jack

On Tue, Sep 13, 2011 at 5:58 AM, Caner Candan caner at candan.fr wrote:

 Hi PETSc users,

 Let me introduce a non-official C++ version of PETSc developed during
 my MSc degree. This work does not include all the components of the
 library, but you can already find some operators like KSPSolver,
 MultiplyMatVec and Dot. The framework uses a functor-based design,
 requiring all components to define an operator function. There are
 also Matrix and Vector data-structure classes.

 The sources are available as free software under the terms of the GNU
 GPL license at https://github.com/canercandan/petsc-cxx

 Have a look also into the test folder in order to see how it works.
 (https://github.com/canercandan/petsc-cxx/tree/master/test)

 To build the project, kindly read the README file at
 https://github.com/canercandan/petsc-cxx/blob/master/README

 BR,
 Caner Candan



[petsc-users] [petsc-maint #86567] Question on KSPs

2011-09-13 Thread Jed Brown
On Tue, Sep 13, 2011 at 10:03, Michele De Stefano 
mstefano at milan.westerngeco.slb.com wrote:

 we would like to explore in more detail the possibility of having a
 completely matrix-free preconditioner.
 Could you explain which are the few methods that we should implement,
 the ones you were talking about?


Matrix-free multigrid, approximate the problem with a simpler model for
which a fast solver (e.g. FMM or FFT-based) is available, the nested
iteration that Matt mentioned, low rank + identity approximations of the
inverse (e.g. built as a byproduct of an outer iteration like BFGS).

None of these are truly generic. I absolutely do not recommend trying to
build matrix-free preconditioners to avoid writing code to assemble a
matrix. Indeed, I think it's well worth having the code to assemble a matrix
even if you usually run without it because it enables you to try many
algorithms without extra effort. Having the matrix available can inform the
design of matrix-free methods.

The performance asymptotics of matrix-free methods are better for some
problems. In many cases, it makes sense to apply the Jacobian using some
matrix-free method and assemble a less expensive matrix for preconditioning.
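That last suggestion amounts to passing different Amat and Pmat to the solver. A sketch using the petsc-3.2-era KSPSetOperators signature (which still takes a MatStructure flag; `Ashell` and `Paij` are illustrative names for a user's shell matrix and an assembled AIJ approximation):

```c
#include <petscksp.h>

/* Sketch: matrix-free action for the operator, assembled (cheaper)
   approximation for building the preconditioner. */
PetscErrorCode SolveWithSplitOperators(KSP ksp, Mat Ashell, Mat Paij,
                                       Vec b, Vec x)
{
  PetscErrorCode ierr;

  /* Amat (applied by Krylov) is the shell matrix; Pmat (used by the
     preconditioner, e.g. block Jacobi/ASM/ILU) is the assembled one. */
  ierr = KSPSetOperators(ksp, Ashell, Paij, SAME_NONZERO_PATTERN);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
  return 0;
}
```

This is the common pattern behind, e.g., matrix-free Newton-Krylov: the exact Jacobian is applied by finite differencing or a hand-coded mult, while a simplified assembled matrix feeds the preconditioner.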


[petsc-users] Problem with PETSc-3.2 configuration

2011-09-13 Thread Rongliang Chen
 mumps_static_mapping.F -o  mumps_static_mapping.o
make[3]: Leaving directory
`/home/ronglian/soft/petsc-3.2-p1/externalpackages/MUMPS_4.10.0/src'
make[2]: Leaving directory
`/home/ronglian/soft/petsc-3.2-p1/externalpackages/MUMPS_4.10.0/src'
make[1]: Leaving directory
`/home/ronglian/soft/petsc-3.2-p1/externalpackages/MUMPS_4.10.0'
mumps_io_basic.c: In function 'mumps_init_file_name':
mumps_io_basic.c:564: warning: assignment discards qualifiers from pointer
target type
mumps_static_mapping.F: In function 'mumps_369':
mumps_static_mapping.F:333: internal compiler error: in modified_type_die,
at dwarf2out.c:8495
Please submit a full bug report,
with preprocessed source if appropriate.
See URL:http://bugzilla.redhat.com/bugzilla for instructions.
make[3]: *** [mumps_static_mapping.o] Error 1
make[2]: *** [s] Error 2
make[1]: *** [mumps_lib] Error 2
make: *** [s] Error 2


[petsc-users] Problem with PETSc-3.2 configuration

2011-09-13 Thread Satish Balay
On Tue, 13 Sep 2011, Jed Brown wrote:

 On Tue, Sep 13, 2011 at 23:26, Rongliang Chen rongliang.chan at 
 gmail.comwrote:
 
  mumps_static_mapping.F: In function 'mumps_369':
  mumps_static_mapping.F:333: internal compiler error: in modified_type_die,
  at dwarf2out.c:8495
  Please submit a full bug report,
  with preprocessed source if appropriate.
  See URL:http://bugzilla.redhat.com/bugzilla for instructions.
 
 
 Um, internal compiler error. File a bug report with your vendor and/or
 upgrade your compilers?

What version of compilers are you using?

mpicc -show
mpif90 -show
mpicc --version
mpif90 --version

Satish


[petsc-users] Problem with PETSc-3.2 configuration

2011-09-13 Thread Rongliang Chen
Hi Jed and Satish,

Thank you for your reply.

Following is the information of the compilers.
mpicc -show

gcc -I/curc/tools/free/redhat_5_x86_64/openmpi-1.4.3_ib/include -pthread
-L/curc/tools/free/redhat_5_x86_64/openmpi-1.4.3_ib/lib -lmpi -lopen-rte
-lopen-pal -ldl -Wl,--export-dynamic -lnsl -lutil -lm -ldl

mpif90 -show

gfortran -I/curc/tools/free/redhat_5_x86_64/openmpi-1.4.3_ib/include
-pthread -I/curc/tools/free/redhat_5_x86_64/openmpi-1.4.3_ib/lib
-L/curc/tools/free/redhat_5_x86_64/openmpi-1.4.3_ib/lib -lmpi_f90 -lmpi_f77
-lmpi -lopen-rte -lopen-pal -ldl -Wl,--export-dynamic -lnsl -lutil -lm -ldl

mpicc --version

gcc (GCC) 4.1.2 20080704 (Red Hat 4.1.2-50)
Copyright (C) 2006 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

mpif90 --version

GNU Fortran (GCC) 4.1.2 20080704 (Red Hat 4.1.2-50)
Copyright (C) 2007 Free Software Foundation, Inc.

GNU Fortran comes with NO WARRANTY, to the extent permitted by law.
You may redistribute copies of GNU Fortran
under the terms of the GNU General Public License.
For more information about these matters, see the file named COPYING


Best,
Rongliang


--

 Message: 5
 Date: Tue, 13 Sep 2011 23:29:56 +0200
 From: Jed Brown jedbrown at mcs.anl.gov
 Subject: Re: [petsc-users] Problem with PETSc-3.2 configuration
 To: PETSc users list petsc-users at mcs.anl.gov
 Message-ID:
CAM9tzSmy4q3u2kqNp_gVj3BQ8S65Zc79v3M4ddFoL5f0BHwZ+g at mail.gmail.com
 
 Content-Type: text/plain; charset=utf-8

 On Tue, Sep 13, 2011 at 23:26, Rongliang Chen rongliang.chan at gmail.com
 wrote:

  mumps_static_mapping.F: In function 'mumps_369':
  mumps_static_mapping.F:333: internal compiler error: in
 modified_type_die,
  at dwarf2out.c:8495
  Please submit a full bug report,
  with preprocessed source if appropriate.
  See URL:http://bugzilla.redhat.com/bugzilla for instructions.
 

 Um, internal compiler error. File a bug report with your vendor and/or
 upgrade your compilers?
 

 --

 Message: 6
 Date: Tue, 13 Sep 2011 16:43:04 -0500 (CDT)
 From: Satish Balay balay at mcs.anl.gov
 Subject: Re: [petsc-users] Problem with PETSc-3.2 configuration
 To: PETSc users list petsc-users at mcs.anl.gov
 Message-ID: alpine.LFD.2.02.1109131641320.2462 at asterix
 Content-Type: text/plain; charset=iso-8859-7

 On Tue, 13 Sep 2011, Jed Brown wrote:

  On Tue, Sep 13, 2011 at 23:26, Rongliang Chen rongliang.chan at gmail.com
 wrote:
 
   mumps_static_mapping.F: In function 'mumps_369':
   mumps_static_mapping.F:333: internal compiler error: in
 modified_type_die,
   at dwarf2out.c:8495
   Please submit a full bug report,
   with preprocessed source if appropriate.
   See URL:http://bugzilla.redhat.com/bugzilla for instructions.
  
 
  Um, internal compiler error. File a bug report with your vendor and/or
  upgrade your compilers?

 What version of compilers are you using?

 mpicc -show
 mpif90 -show
 mpicc --version
 mpif90 --version

 Satish

 --

 ___
 petsc-users mailing list
 petsc-users at mcs.anl.gov
 https://lists.mcs.anl.gov/mailman/listinfo/petsc-users


 End of petsc-users Digest, Vol 33, Issue 36
 ***



[petsc-users] Problem with PETSc-3.2 configuration

2011-09-13 Thread Satish Balay
Ok - this is on a RHEL5 box. And I'm able to reproduce the internal compiler
error.

 mumps_static_mapping.F:333: internal compiler error: in modified_type_die, at 
 dwarf2out.c:8495

The error is in the debug symbol handler in the compiler. Looks like you
can work around this by using an optimized build, i.e.

--with-debugging=0

or a debug build with:

[default --with-debugging=1] FOPTFLAGS='-g -O'

Satish

On Tue, 13 Sep 2011, Rongliang Chen wrote:

 Hi Jed and Satish,
 
 Thank you for your reply.
 
 Following is the information of the compilers.
 mpicc -show
 
 gcc -I/curc/tools/free/redhat_5_x86_64/openmpi-1.4.3_ib/include -pthread
 -L/curc/tools/free/redhat_5_x86_64/openmpi-1.4.3_ib/lib -lmpi -lopen-rte
 -lopen-pal -ldl -Wl,--export-dynamic -lnsl -lutil -lm -ldl
 
 mpif90 -show
 
 gfortran -I/curc/tools/free/redhat_5_x86_64/openmpi-1.4.3_ib/include
 -pthread -I/curc/tools/free/redhat_5_x86_64/openmpi-1.4.3_ib/lib
 -L/curc/tools/free/redhat_5_x86_64/openmpi-1.4.3_ib/lib -lmpi_f90 -lmpi_f77
 -lmpi -lopen-rte -lopen-pal -ldl -Wl,--export-dynamic -lnsl -lutil -lm -ldl
 
 mpicc --version
 
 gcc (GCC) 4.1.2 20080704 (Red Hat 4.1.2-50)
 Copyright (C) 2006 Free Software Foundation, Inc.
 This is free software; see the source for copying conditions.  There is NO
 warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
 
 mpif90 --version
 
 GNU Fortran (GCC) 4.1.2 20080704 (Red Hat 4.1.2-50)
 Copyright (C) 2007 Free Software Foundation, Inc.
 
 GNU Fortran comes with NO WARRANTY, to the extent permitted by law.
 You may redistribute copies of GNU Fortran
 under the terms of the GNU General Public License.
 For more information about these matters, see the file named COPYING
 
 
 Best,
 Rongliang
 
 
 --
 
  Message: 5
  Date: Tue, 13 Sep 2011 23:29:56 +0200
  From: Jed Brown jedbrown at mcs.anl.gov
  Subject: Re: [petsc-users] Problem with PETSc-3.2 configuration
  To: PETSc users list petsc-users at mcs.anl.gov
  Message-ID:
 CAM9tzSmy4q3u2kqNp_gVj3BQ8S65Zc79v3M4ddFoL5f0BHwZ+g at 
  mail.gmail.com
  
  Content-Type: text/plain; charset=utf-8
 
  On Tue, Sep 13, 2011 at 23:26, Rongliang Chen rongliang.chan at gmail.com
  wrote:
 
    mumps_static_mapping.F: In function 'mumps_369':
    mumps_static_mapping.F:333: internal compiler error: in modified_type_die,
    at dwarf2out.c:8495
   Please submit a full bug report,
   with preprocessed source if appropriate.
   See URL:http://bugzilla.redhat.com/bugzilla for instructions.
  
 
  Um, internal compiler error. File a bug report with your vendor and/or
  upgrade your compilers?
 
  --
 
  Message: 6
  Date: Tue, 13 Sep 2011 16:43:04 -0500 (CDT)
  From: Satish Balay balay at mcs.anl.gov
  Subject: Re: [petsc-users] Problem with PETSc-3.2 configuration
  To: PETSc users list petsc-users at mcs.anl.gov
  Message-ID: alpine.LFD.2.02.1109131641320.2462 at asterix
  Content-Type: text/plain; charset=iso-8859-7
 
  On Tue, 13 Sep 2011, Jed Brown wrote:
 
   On Tue, Sep 13, 2011 at 23:26, Rongliang Chen rongliang.chan at gmail.com
  wrote:
  
 mumps_static_mapping.F: In function 'mumps_369':
 mumps_static_mapping.F:333: internal compiler error: in modified_type_die,
 at dwarf2out.c:8495
Please submit a full bug report,
with preprocessed source if appropriate.
See URL:http://bugzilla.redhat.com/bugzilla for instructions.
   
  
   Um, internal compiler error. File a bug report with your vendor and/or
   upgrade your compilers?
 
  What version of compilers are you using?
 
  mpicc -show
  mpif90 -show
  mpicc --version
  mpif90 --version
 
  Satish
 



[petsc-users] Problem with PETSc-3.2 configuration

2011-09-13 Thread Barry Smith

  Also, that gfortran compiler is terribly old; you really should install the 
4.6 version of the GNU compiler suite instead of living with that old stuff.
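
If you do upgrade, a sketch of pointing PETSc's configure at the newer
compilers might look like this (the install prefix /opt/gcc-4.6 is
hypothetical; adjust it to wherever the new toolchain lands):

```shell
# Put the newer GCC first on PATH (hypothetical path; adjust to your install).
export PATH=/opt/gcc-4.6/bin:$PATH
gcc --version       # verify the newer gcc is now picked up
gfortran --version  # verify the newer gfortran is now picked up

# Rebuild MPI against the new compilers too, since the existing mpicc/mpif90
# wrappers would still invoke the old gcc/gfortran 4.1.2.
./configure --with-cc=gcc --with-fc=gfortran --download-mpich --download-mumps
```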

   Barry

On Sep 13, 2011, at 6:13 PM, Satish Balay wrote:

 Ok - this is on a RHEL5 box, and I'm able to reproduce the internal compiler 
 error.
 
 mumps_static_mapping.F:333: internal compiler error: in modified_type_die, 
 at dwarf2out.c:8495
 
 The error is in the debug-symbol handler in the compiler. It looks like you
 can work around this by using an optimized build, i.e.
 
 --with-debugging=0
 
 or debug build with:
 
 [default --with-debugging=1 ] FOPTFLAGS='-g -O'
 
 Satish
 