Worthwhile to use PETSc to solve lots of small matrices?

2007-06-17 Thread Ben Tay
if it is worthwhile to use PETSc to solve them, or am I better off using direct Gaussian elimination solvers? Thanks
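For context, here is a minimal sketch of what a single small dense solve looks like when driven through PETSc (none of this code is from the thread; the 10x10 diagonal system is invented, and the call signatures follow 2007-era PETSc, e.g. the four-argument KSPSetOperators and Destroy routines taking the object itself). KSPPREONLY plus PCLU makes the KSP a thin wrapper around a direct LU factorization, so the real question for lots of small matrices is whether the per-solve object setup below is worth it compared to calling a LAPACK routine such as dgesv directly.

    /* Sketch: direct LU solve of one small dense system via KSP.
     * Signatures assume 2007-era PETSc; newer releases differ.     */
    #include "petscksp.h"

    int main(int argc, char **argv)
    {
      Mat      A;
      Vec      b, x;
      KSP      ksp;
      PC       pc;
      PetscInt i, n = 10;

      PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);

      /* Illustrative 10x10 diagonal system */
      MatCreateSeqDense(PETSC_COMM_SELF, n, n, PETSC_NULL, &A);
      for (i = 0; i < n; i++) MatSetValue(A, i, i, (PetscScalar)(i + 1), INSERT_VALUES);
      MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
      MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

      VecCreateSeq(PETSC_COMM_SELF, n, &b);
      VecDuplicate(b, &x);
      VecSet(b, 1.0);

      KSPCreate(PETSC_COMM_SELF, &ksp);
      KSPSetOperators(ksp, A, A, SAME_NONZERO_PATTERN);
      KSPSetType(ksp, KSPPREONLY);   /* no Krylov iterations ...       */
      KSPGetPC(ksp, &pc);
      PCSetType(pc, PCLU);           /* ... just factor and back-solve */
      KSPSolve(ksp, b, x);

      KSPDestroy(ksp); VecDestroy(b); VecDestroy(x); MatDestroy(A);
      return PetscFinalize();
    }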

MatCreateMPIAIJ Pre-allocation Query

2007-06-17 Thread Matthew Knepley
On 6/17/07, Tim Stitt timothy.stitt at ichec.ie wrote: Hi All, Currently I am using MatCreateMPIAIJ to create a distributed sparse matrix for use in my parallel sparse eigensolver. If I understand things correctly, it is important to specify the correct pre-allocation values for (o_nz,d_nz)
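For reference, a hedged sketch of the call being discussed (the sizes and counts below are invented): d_nz and o_nz are per-row upper bounds on nonzeros in the diagonal and off-diagonal blocks of the local rows, and the optional d_nnz/o_nnz arrays, when non-NULL, give exact per-row counts that override the scalar estimates.

    /* Sketch only: local size m, global size N, and the per-row
     * bounds are illustrative.                                     */
    Mat      A;
    PetscInt m = 1000, N = 16000;

    MatCreateMPIAIJ(PETSC_COMM_WORLD, m, m, N, N,
                    5, PETSC_NULL,  /* d_nz: at most 5 nonzeros per row
                                       in the diagonal block          */
                    2, PETSC_NULL,  /* o_nz: at most 2 per row in the
                                       off-diagonal block             */
                    &A);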

MatCreateMPIAIJ Pre-allocation Query

2007-06-17 Thread Barry Smith
Tim, It is possible the macros MatPreallocateInitialize(), ... in petscmat.h are exactly what you need. Barry Take a look at DAGetMatrix2d_MPIAIJ() in src/dm/da/utils/fdda.c for an example of how they can be used. On Sun, 17 Jun 2007, Tim Stitt wrote: Hi All, Currently I am
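A hedged sketch of how those macros fit together (the row loop and counts are invented; DAGetMatrix2d_MPIAIJ() remains the authoritative example): MatPreallocateInitialize() opens a scope and allocates the dnz/onz count arrays, MatPreallocateSet() tallies each row's columns into them, and MatPreallocateFinalize() frees the arrays and closes the scope.

    /* Sketch: count nonzeros per local row, then create the matrix. */
    Mat      A;
    PetscInt m, N, rstart, rend, row, ncols, cols[5];
    PetscInt *dnz, *onz;

    /* ... set m, N, rstart, rend for this process ... */
    MatPreallocateInitialize(PETSC_COMM_WORLD, m, m, dnz, onz);
    for (row = rstart; row < rend; row++) {
      /* fill cols[] with the global columns this row will touch,
         set ncols accordingly */
      MatPreallocateSet(row, ncols, cols, dnz, onz);
    }
    MatCreateMPIAIJ(PETSC_COMM_WORLD, m, m, N, N, 0, dnz, 0, onz, &A);
    MatPreallocateFinalize(dnz, onz);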

Is it possible to build one Petsc that works for both C and C++?

2007-06-17 Thread Shi Jin
Great! Thanks a lot! Shi --- Matthew Knepley knepley at gmail.com wrote: This is an MPICH problem. From mpicxx.h: // There is a name conflict between stdio.h and the MPI C++ binding // with respect to the names SEEK_SET, SEEK_CUR, and SEEK_END. MPI // wants these in the MPI namespace,
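For anyone hitting the same clash, the standard workaround (stated here as an aside, not quoted from the thread) is MPICH's own escape hatch: define MPICH_IGNORE_CXX_SEEK before any MPI header is seen, or pass it on the compile line.

    // Workaround sketch for the SEEK_SET/SEEK_CUR/SEEK_END conflict:
    #define MPICH_IGNORE_CXX_SEEK   // must precede the first MPI include
    #include <mpi.h>
    #include <cstdio>
    // Equivalently, add -DMPICH_IGNORE_CXX_SEEK to CXXFLAGS so no
    // source changes are needed.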

MatCreateMPIAIJ Pre-allocation Query

2007-06-17 Thread Tim Stitt
Thanks for that Matt...will check it out. Incidentally, I need to pass a 64-bit integer to MatCreateMPIAIJ. Do I need to rebuild PETSc using a 64-bit integer option switch? Tim. On Sunday 17 June 2007 17:07, Matthew Knepley wrote: On 6/17/07, Tim Stitt timothy.stitt at ichec.ie wrote: Hi

MatCreateMPIAIJ Pre-allocation Query

2007-06-17 Thread Barry Smith
If your problem is so big that you need integers that represent numbers more than about 2 billion, then yes, you will need to rebuild PETSc. Barry You'll be the first PETSc user who solves a problem with more than 2 billion unknowns. On Sun, 17 Jun 2007, Tim Stitt wrote: Thanks for that
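The rebuild Barry describes is a configure-time switch; a hedged example follows (the exact option name has varied across PETSc releases, so check configure --help for your version):

    # Reconfigure with 64-bit PetscInt, then rebuild.  Spelling per
    # recent releases; older ones used --with-64-bit-ints.
    ./config/configure.py --with-64-bit-indices=1
    make all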

MatCreateMPIAIJ Pre-allocation Query

2007-06-17 Thread Tim Stitt
Actually my sparse matrix has 2^32 rows and columns, hence the global row and column arguments need to be 64-bit. On Sunday 17 June 2007 17:36, Barry Smith wrote: If your problem is so big that you need integers that represent numbers more than about 2 billion then yes you will need to

MatCreateMPIAIJ Pre-allocation Query

2007-06-17 Thread Tim Stitt
Matt, We are investigating 32-qubit fault-tolerant quantum computation, which results in 2^32 basis states and hence our large Hamiltonian matrix. I just ran into problems passing 2^32 in as the global row and column arguments, so I assumed I needed to rebuild PETSc to accept 64-bit integers. Tim.
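As an illustration of the failure mode (assumed, not taken from the thread): with the default build PetscInt is a 32-bit int, whose maximum is 2^31 - 1, so a global size of 2^32 cannot even be represented, let alone passed to MatCreateMPIAIJ; a build with 64-bit indices makes PetscInt an 8-byte integer.

    /* Illustration only: representable range of the global size.   */
    PetscInt N;
    /* 32-bit PetscInt: 2^32 overflows (max is 2147483647), so the
     * shift below is invalid.  With 64-bit PetscInt, N correctly
     * holds 4294967296 and can be passed as a global dimension.    */
    N = (PetscInt)1 << 32;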

MatCreateMPIAIJ Pre-allocation Query

2007-06-17 Thread Tim Stitt
Cheers...perfect. On Sunday 17 June 2007 18:16, Matthew Knepley wrote: On 6/17/07, Tim Stitt timothy.stitt at ichec.ie wrote: Matt, We are investigating 32-qubit fault-tolerant quantum computation which results in 2^32 basis states and hence our large Hamiltonian matrix. I just ran

MatCreateMPIAIJ Pre-allocation Query

2007-06-17 Thread Tim Stitt
Matt, Also at the end of the configure run I get the following error: characteristic.c:253: error: conflicting types for `CharacteristicSetVelocityInterpolation' /ichec/home/staff/tstitt/builds/petsc/src/contrib/semiLagrange/characteristic.h:47: error: previous declaration of