Re: [petsc-users] Solving tridiagonal hermitian generalized eigenvalue problem with SLEPC

2014-08-18 Thread Hong
data structure for A and B would make MatAXPY faster, but may not affect performance much if you only use a single shift, e.g., ex13.c. Hong Error messages: identical for KRYLOV-SCHUR and ARNOLDI [0]PETSC ERROR: - Error Message [0

Re: [petsc-users] Solving tridiagonal hermitian generalized eigenvalue problem with SLEPC

2014-08-18 Thread Hong
Toon : Sorry, I forgot to mention: I am looking for the Eigenvalues that are the largest, not in absolute value, but along the real axis. Then you do not need shift-and-invert, and therefore should not need LU matrix factorization. Hong On 18 August 2014 21:54, Toon Weyens twey...@fis.uc3m.es
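A minimal SLEPc sketch of that advice, assuming a standard EPS setup (the matrices A and B, the ierr/CHKERRQ error handling, and the surrounding program are placeholders): request EPS_LARGEST_REAL instead of configuring a shift-and-invert spectral transformation.

    #include <slepceps.h>
    /* Sketch: ask for the eigenvalues with the largest real part, so no
       shift-and-invert (and hence no LU factorization) is required. */
    EPS eps;
    ierr = EPSCreate(PETSC_COMM_WORLD,&eps);CHKERRQ(ierr);
    ierr = EPSSetOperators(eps,A,B);CHKERRQ(ierr);             /* A x = lambda B x */
    ierr = EPSSetProblemType(eps,EPS_GHEP);CHKERRQ(ierr);      /* Hermitian generalized problem */
    ierr = EPSSetWhichEigenpairs(eps,EPS_LARGEST_REAL);CHKERRQ(ierr);
    ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);
    ierr = EPSSolve(eps);CHKERRQ(ierr);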

Re: [petsc-users] Solving tridiagonal hermitian generalized eigenvalue problem with SLEPC

2014-08-19 Thread Hong
start searching for efficient solvers among those working ones obtained in step 2. Hong On Tue, Aug 19, 2014 at 10:38 AM, Jed Brown j...@jedbrown.org wrote: Toon Weyens twey...@fis.uc3m.es writes: Yes, you are probably right: My code is not yet bug free (by all means!). However, I have been

Re: [petsc-users] Convert matrix to 1-D array

2014-08-21 Thread Hong
You can call MatGetSubMatrices() http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatGetSubMatrices.html Hong On Thu, Aug 21, 2014 at 6:25 PM, priyank patel priyankpate...@live.com wrote: hi, Ya this will give me only local entries. But what if I want the complete matrix
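A hedged sketch of that suggestion (PETSc 3.5-era API; A is the parallel matrix, everything else is illustrative): gather the whole matrix onto each process with MatGetSubMatrices(), then convert it to dense to read the entries as a column-major 1-D array.

    Mat         *Asub,Adense;
    IS          isrow,iscol;
    PetscInt    M,N;
    PetscScalar *array;
    ierr = MatGetSize(A,&M,&N);CHKERRQ(ierr);
    /* request all global rows/columns on this process */
    ierr = ISCreateStride(PETSC_COMM_SELF,M,0,1,&isrow);CHKERRQ(ierr);
    ierr = ISCreateStride(PETSC_COMM_SELF,N,0,1,&iscol);CHKERRQ(ierr);
    ierr = MatGetSubMatrices(A,1,&isrow,&iscol,MAT_INITIAL_MATRIX,&Asub);CHKERRQ(ierr);
    ierr = MatConvert(Asub[0],MATSEQDENSE,MAT_INITIAL_MATRIX,&Adense);CHKERRQ(ierr);
    ierr = MatDenseGetArray(Adense,&array);CHKERRQ(ierr);   /* array[i + j*M] = A(i,j) */
    /* ... use array ..., then MatDenseRestoreArray() and destroy the work objects */

Note that every process then stores the entire matrix, so this is only reasonable for small problems.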

Re: [petsc-users] Operations available for different types?

2014-08-26 Thread Hong
Murat, Do you need MatGetRowMinAbs() for dftb eigenvalue problem? Hong On Tue, Aug 26, 2014 at 9:49 PM, Barry Smith bsm...@mcs.anl.gov wrote: On Aug 26, 2014, at 9:35 PM, murat keçeli kec...@gmail.com wrote: Since sbaij only stores the above diagonal half of the matrix getting the row

Re: [petsc-users] PETSC errors from KSPSolve() with MUMPS

2014-08-28 Thread Hong
Evan, Please comment out your own mumps parameters and run the code with the default icnt and ival. Does it still crash? If so, please send us the entire error message. It is common to get memory errors in the numerical factorization of mumps. I've rarely seen errors occur in the symbolic phase. Hong

Re: [petsc-users] superlu_dist and MatSolveTranspose

2014-08-29 Thread Hong
We can add MatSolveTranspose() to the petsc interface with superlu_dist. Jed, Are you working on it? If not, I can work on it. Hong On Fri, Aug 29, 2014 at 6:14 PM, Gaetan Kenway gaet...@gmail.com wrote: Hi Antoine We are also using PETSc for solving adjoint systems resulting from CFD

Re: [petsc-users] Valgrind Errors

2014-09-12 Thread Hong
I'll check it. Hong On Fri, Sep 12, 2014 at 3:40 PM, Dominic Meiser dmei...@txcorp.com wrote: On 09/12/2014 02:11 PM, Barry Smith wrote: James (and Hong), Do you ever see this problem in parallel runs? You are not doing anything wrong. Here is what is happening

Re: [petsc-users] Valgrind Errors

2014-09-15 Thread Hong
James : I'm fixing it in branch hzhang/matmatmult-bugfix https://bitbucket.org/petsc/petsc/commits/a7c7454dd425191f4a23aa5860b8c6bac03cfd7b Once it is further cleaned, and other routines are checked, I will patch petsc-release. Hong Hi Barry, Thanks for the response. You're right, it (both

Re: [petsc-users] superlu_dist and MatSolveTranspose

2014-09-22 Thread Hong
I'll add it. It would not take too long, just a matter of priority. I'll try to get it done in a day or two, then let you know when it works. Hong On Mon, Sep 22, 2014 at 12:11 PM, Antoine De Blois antoine.debl...@aero.bombardier.com wrote: Dear all, Sorry for the delay on this topic. Thank

Re: [petsc-users] Valgrind Errors

2014-09-23 Thread Hong
James, The fix is pushed to petsc-maint (release) https://bitbucket.org/petsc/petsc/commits/c974faeda5a26542265b90934a889773ab380866 Thanks for your report! Hong On Mon, Sep 15, 2014 at 5:05 PM, Hong hzh...@mcs.anl.gov wrote: James : I'm fixing it in branch hzhang/matmatmult-bugfix https

Re: [petsc-users] superlu_dist and MatSolveTranspose

2014-09-23 Thread Hong
Antoine, I just found out that superlu_dist does not support MatSolveTranspose yet (see Sherry's email below). Once superlu_dist provides this support, we can add it to the petsc/superlu_dist interface. Thanks for your patience. Hong --- Hong, Sorry

Re: [petsc-users] KSP_DIVERGED_INDEFINITE_PC message even with PCFactorSetShiftType

2014-09-29 Thread Hong
to mumps developer about it. Hong

Re: [petsc-users] Handling dense matrices in RBF interpolation

2014-10-24 Thread Hong
input block size bs=n, which I guess should be 1. Hong results in: [0]PETSC ERROR: Arguments are incompatible [0]PETSC ERROR: Cannot change block size 1 to 503 What would you advise me to do here? How can I make the KSPSolve faster when it is a dense matrix? I could guess

Re: [petsc-users] passing solver options in fieldsplit

2014-11-11 Thread Hong
Luc: Run your code with option '-help |grep mumps', then you'll see what prefix should be used in your case with the mumps option '-mat_mumps_icntl_14 30'. You may try even larger icntl_14. Hong Hi, I am using Petsc to solver a multiphysics problem and I have the following issue. I partition
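For illustration only (the actual prefix depends on the split names reported by -help, and './myapp' and the split name '0' below are hypothetical): once the sub-solver's prefix is known, the MUMPS option from the thread is given with that prefix.

    ./myapp -help | grep mumps
    ./myapp -fieldsplit_0_mat_mumps_icntl_14 30 ...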

Re: [petsc-users] Verifying ParMetis used

2014-11-17 Thread Hong
analysis and icntl(29) ordering (None) -mat_mumps_icntl_29 0: ICNTL(29): parallel ordering 1 = ptscotch 2 = parmetis (None) Hong On Mon, Nov 17, 2014 at 9:12 AM, Mark Adams mfad...@lbl.gov wrote: I have a code that repartitions the grid using PETSc. I would like to know the simplest
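Combining the options listed above with the ex2 test command quoted in the next message, a hypothetical run that asks MUMPS to use ParMetis for its parallel ordering would look like (ICNTL(28)=2 selects parallel analysis, ICNTL(29)=2 selects parmetis):

    mpiexec -n 2 ./ex2 -pc_type lu -pc_factor_mat_solver_package mumps \
            -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2

Adding -ksp_view should then report the ICNTL values actually used by the factorization.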

Re: [petsc-users] Verifying ParMetis used

2014-11-17 Thread Hong
via mumps. Hong Mark On Mon, Nov 17, 2014 at 10:46 AM, Hong hzh...@mcs.anl.gov wrote: Mark, I use ParMetis with mumps direct solver. I can test their installation with petsc/src/ksp/ksp/examples/tutorials mpiexec -n 2 ./ex2 -pc_type lu -pc_factor_mat_solver_package mumps

Re: [petsc-users] Transpose of rectangular dense parallel matrix

2015-02-03 Thread Hong
Ghosh: For parallel dense matrix-matrix operations, we suggest using the Elemental package http://libelemental.org Hong I am trying to calculate the transpose of a dense rectangular matrix (pSddft->YOrb, size=Npts x Nstates) and then MatMatMult I am creating the dense matrix first of size

Re: [petsc-users] Large rectangular Dense Transpose multiplication with sparse

2015-02-05 Thread Hong
Swarnava: The matrix product A will be a dense matrix. You may consider using Elemental package for such matrix product. Hong Dear all, I am trying to compute matrices A = transpose(R)*H*R and M = transpose(R)*R where H is a sparse (banded) matrix in MATMPIAIJ format (5 million x 5

Re: [petsc-users] Large rectangular Dense Transpose multiplication with sparse

2015-02-06 Thread Hong
:-) Hong -- *From: *Hong hzh...@mcs.anl.gov *To: *Swarnava Ghosh sghosh2...@gatech.edu *Cc: *PETSc users list petsc-users@mcs.anl.gov *Sent: *Thursday, February 5, 2015 8:22:13 PM *Subject: *Re: [petsc-users] Large rectangular Dense Transpose multiplication

Re: [petsc-users] Monte carlo for eigenvalue problem in paralle

2015-01-21 Thread Hong
take a look at Elemental and check which eigenvalue routine would work for your problem. We may add it to the interface if it does not take much effort. Hong On Wed, Jan 21, 2015 at 9:55 AM, Luc Berger-Vergiat lb2...@columbia.edu wrote: You can also look into SLEPc to compute eigenvalues

Re: [petsc-users] Least Square solver LSQR

2015-01-06 Thread Hong
Natacha: I can reproduce the error with your ex1f.F. The lsqr solver in PETSc was contributed by a user a decade ago. I'll read the original algorithm and investigate it. I'll let you know the result. Hong Dear PETSc users, I am trying to solve an overdetermined linear system of equations Ax

Re: [petsc-users] Sparse triangular solver

2015-03-08 Thread Hong
factorization, e.g., ILU, instead of full factorization? The backward subs are steps AFTER matrix factorization. Hong On Mar 8, 2015 6:26 PM, Barry Smith bsm...@mcs.anl.gov wrote: PETSc provides sparse parallel LU (and Cholesky) factorizations and solves via the external packages

Re: [petsc-users] Simple question regarding dense matrix algebra using elemental

2015-04-13 Thread Hong
, allocated nonzeros=334000 total number of mallocs used during MatSetValues calls =0 Elemental run parameters: allocated entries=334000 grid height=1, grid width=3 linear system matrix = precond matrix: ... Everything looks correct. Hong On Mon, Apr

Re: [petsc-users] call lapack subroutines in petsc

2015-04-23 Thread Hong
Wen: Petsc-Lapack interface functions are listed in petsc/include/petscblaslapack.h. I do not see dggev there. You may add such an interface yourself, or call it via SLEPc. Hong Could anyone tell me how to call a lapack subroutine in PETSc? I would like to use dggev to calculate a generalized
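A self-contained sketch of calling dggev directly (not through PETSc), assuming the common trailing-underscore Fortran name mangling and a link against a LAPACK library; the matrices and sizes below are made up for illustration:

    #include <stdio.h>
    /* LAPACK dggev prototype (Fortran symbol with trailing underscore assumed) */
    extern void dggev_(const char *jobvl,const char *jobvr,const int *n,
                       double *a,const int *lda,double *b,const int *ldb,
                       double *alphar,double *alphai,double *beta,
                       double *vl,const int *ldvl,double *vr,const int *ldvr,
                       double *work,const int *lwork,int *info);

    int main(void)
    {
      int    n = 2, lwork = 16, info;
      /* column-major 2x2 matrices for A x = lambda B x */
      double a[4] = {2.0, 1.0, 1.0, 3.0};
      double b[4] = {1.0, 0.0, 0.0, 1.0};
      double alphar[2], alphai[2], beta[2], vr[4], work[16];

      /* VL is not referenced when JOBVL = "N" */
      dggev_("N","V",&n,a,&n,b,&n,alphar,alphai,beta,NULL,&n,vr,&n,work,&lwork,&info);
      if (info) { printf("dggev failed: info=%d\n",info); return 1; }
      for (int i = 0; i < n; i++)
        printf("lambda[%d] = %g + %gi\n",i,alphar[i]/beta[i],alphai[i]/beta[i]);
      return 0;
    }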

Re: [petsc-users] SBAIJ: Distribution of rows

2015-04-27 Thread Hong
matrix, this distribution is well-balanced. Users can set their own distribution by giving the local row sizes and setting the global rows as PETSC_DECIDE. Hong

Re: [petsc-users] “PETSC ERROR: Unable to find requested PCtype hypre”

2015-04-27 Thread Hong
Carol, Have you built your petsc with hypre? You can use ''--download-hypre' during petsc configuration. Hong On Mon, Apr 27, 2015 at 9:28 AM, carol.brick...@awe.co.uk wrote: Hi, I am trying to run an executable built with petsc 3.5.3 and hypre 2.9.0b with flags “-pc_type hypre –ksp_type cg

Re: [petsc-users] Getting unsupported type message from MatCholeskyFactor

2015-05-04 Thread Hong
supports parallel LU, not Cholesky. Hong

Re: [petsc-users] Is matrix analysis available in PETSc or external package?

2015-05-11 Thread Hong
know what solvers are being used. My guess is the default gmres/bjacobi/ilu(0). Please run your code with the option '-ts_view' or '-snes_view' to find out. Hong

Re: [petsc-users] MUMPS error

2015-05-18 Thread Hong
be scalable? The matrix factors for np=2 and 8 might be very different. We would like to know what the mumps developers say about it. Hong Hi, I have emailed the mumps-user list. Actually the cluster has 8 nodes with 16 cores, and other codes scale well. I wanted to ask if this job takes much time

Re: [petsc-users] petsc4py and mumps (or other direct sparse solver)

2015-04-15 Thread Hong
. Hong Dear petsc-users, I am a beginner in petsc and I have some question with the python interface. I am trying to solve problem with mumps (or other direct sparse solver) I have written the following piece of code ksp = PETSc.KSP() ksp.create(PETSc.COMM_WORLD) ksp.setOperators

Re: [petsc-users] Simple question regarding dense matrix algebra using elemental

2015-04-12 Thread Hong
options mpiexec -n 3 ./ex2 -pc_type lu -pc_factor_mat_solver_package elemental -mat_type elemental Norm of error 2.81086e-15 iterations 1 Please using petsc-dev (master branch) for petsc-elemental interface. Hong On Sun, Apr 12, 2015 at 6:57 PM, Preyas Shah shah.pre...@gmail.com wrote: Hi, I

Re: [petsc-users] MatMatMult with dense matrices.

2015-04-05 Thread Hong
what goes wrong? Hong On Sun, Apr 5, 2015 at 3:44 PM, Barry Smith bsm...@mcs.anl.gov wrote: We would need to see the PETSc side of the code to see if there is anything wrong there. On Apr 5, 2015, at 3:35 PM, James A Charles charl...@purdue.edu wrote: Hi Hong, You can open up

Re: [petsc-users] MatMatMult with dense matrices.

2015-04-07 Thread Hong
James: Thanks a lot for looking into this. I'm still working on debugging this on our side. It might be an issue with us. I will keep you updated. Take your time. Hong - Original Message - From: Hong hzh...@mcs.anl.gov To: Barry Smith bsm...@mcs.anl.gov Cc: James A Charles

Re: [petsc-users] MatMatMult with dense matrices.

2015-04-05 Thread Hong
information we convert the previous matrix that A is formed of A2 (A = A1*A2) to dense prior to the multiplication using MatConvert. It seems both A and B are dense, complex square matrices. Did you call MatMatMult() in sequential or parallel? What matrix format did you use? Hong

Re: [petsc-users] Exploiting symmetry with direct solvers

2015-04-08 Thread Hong
, and symmetric+spd matrices. You may consult mumps user manual. - I gather that SuperLU doesn't provide a symmetric factorization. SuperLU does not support Cholesky factorization. Hong

Re: [petsc-users] Exploiting symmetry with direct solvers

2015-04-08 Thread Hong
^T factorization. See MatGetFactor_xxx_mumps() in petsc/src/mat/impls/aij/mpi/mumps/mumps.c: ... B->factortype = MAT_FACTOR_CHOLESKY; if (A->spd_set && A->spd) mumps->sym = 1; else mumps->sym = 2; ... Hong

Re: [petsc-users] DMNetwork usage example

2015-04-08 Thread Hong
See *petsc/src/snes/examples/tutorials/network/pflow* *Hong* On Wed, Apr 8, 2015 at 9:39 PM, Dharmendar Reddy dharmaredd...@gmail.com wrote: Hello, Is there a Fortran or C code example illustrating the usage of DMNetwork ? Thanks Reddy

Re: [petsc-users] Calling single-precision MUMPS from PETSC

2015-06-07 Thread Hong
#endif and replacing d to s for double real: #if defined(PETSC_USE_REAL_SINGLE) #include smumps_c.h #else //#include dmumps_c.h // old #include smumps_c.h // new #endif Hong On Fri, Jun 5, 2015 at 6:26 PM, Evan Um eva...@gmail.com wrote: Dear Barry and PETSC users, I am revisiting
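The preprocessor fragment above is flattened by the archive; a plausible reconstruction of the suggested edit to the real branch of petsc's mumps interface (angle-bracket includes assumed) is:

    #if defined(PETSC_USE_REAL_SINGLE)
    #include <smumps_c.h>
    #else
    /* #include <dmumps_c.h> */   /* old: double-precision MUMPS header */
    #include <smumps_c.h>         /* new: force the single-precision MUMPS header */
    #endif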

Re: [petsc-users] MUMPS error and superLU error

2015-06-22 Thread Hong
Venkatesh, You may also test superlu_dist, which may use less memory. Hong On Mon, Jun 22, 2015 at 12:43 PM, Barry Smith bsm...@mcs.anl.gov wrote: There is nothing we can really do to help on the PETSc side. I do note from the output REDISTRIB: TOTAL DATA LOCAL/SENT = 328575589

Re: [petsc-users] -pc_mg_monitor

2015-06-11 Thread Hong
David: PETSc library does not have the option '-pc_mg_monitor'. Hong On Thu, Jun 11, 2015 at 6:48 AM, David Scott d.sc...@ed.ac.uk wrote: Hello, I am using MINRES with GAMG and have supplied various options #PETSc Option Table entries: -ksp_max_it 500 -ksp_monitor_true_residual

Re: [petsc-users] MUMPS error and superLU error

2015-05-29 Thread Hong
venkatesh: On Tue, May 26, 2015 at 9:02 PM, Hong hzh...@mcs.anl.gov wrote: 'A serial job in MATLAB for the same matrices takes 60GB. ' Can you run this case in serial? If so, try petsc, superlu or mumps to make sure the matrix is non-singular. B matrix is singular but I get my result

Re: [petsc-users] MUMPS error and superLU error

2015-05-31 Thread Hong
' likely uses B^{-1} (have you read slepc manual?), which could be the source of trouble. Please investigate your model, understand why B is singular; if there is a way to dump null space before submitting large size simulation. Hong On Sun, May 31, 2015 at 8:36 AM, Dave May dave.mayhe

Re: [petsc-users] QCG and KSPSetPCSide

2015-06-29 Thread Hong
://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCICC.html#PCICC: D = L^T, implemented with forward and backward solves. Here L is an incomplete Cholesky factor of H. Hong On Mon, Jun 29, 2015 at 9:22 AM, carol.brick...@awe.co.uk wrote: All, I am trying to use KSPSolve for a QCG

Re: [petsc-users] MUMPS error and superLU error

2015-05-26 Thread Hong
0.00 Solve flops 2.194000e+03 Mflops 5.14 Norm of error 1.18018e-15 iterations 1 Hong On Tue, May 26, 2015 at 9:03 AM, venkatesh g venkateshg...@gmail.com wrote: I posted a while ago in MUMPS forums but no one seems to reply. I am solving a large generalized Eigenvalue problem. I am

Re: [petsc-users] SuperLU MPI-problem

2015-08-03 Thread Hong
sets matinput=DISTRIBUTED as default when using more than one process. Did you either use '-mat_superlu_dist_parsymbfact' for a sequential run or set matinput=GLOBAL for a parallel run? I'll add an error flag for these use cases. Hong On Mon, Aug 3, 2015 at 9:17 AM, Xiaoye S. Li x...@lbl.gov wrote

Re: [petsc-users] petsc KLU

2015-08-17 Thread Hong
not have experience on SuiteSparse. Testing MUMPS is worth it as well. Hong Hi Thank you for your answer. I was asking help because I find LU factorization 2-3 times faster than KLU. According to my problem size (200*200) and type (power system simulation), I should get almost the same

Re: [petsc-users] TSSUNDIALS

2015-06-30 Thread Hong
Jed, Do we support DAE with sundials? Hong On Tue, Jun 30, 2015 at 12:11 PM, Jed Brown j...@jedbrown.org wrote: Hasan, Fahad mhas...@vols.utk.edu writes: Hello PETSc team, Do you have any example for solving ODE/DAE using TSSUNDIALS solver? Run any TS example with -ts_type sundials.

Re: [petsc-users] TSSUNDIALS

2015-06-30 Thread Hong
petsc/src/ts/examples/tutorials/ex13.c for ODE using the TSSUNDIALS solver. We do not have an interface for solving DAEs with TSSUNDIALS. Hong On Tue, Jun 30, 2015 at 12:03 PM, Hasan, Fahad mhas...@vols.utk.edu wrote: Hello PETSc team, Do you have any example for solving ODE/DAE using TSSUNDIALS

Re: [petsc-users] Can't expand MemType 1: jcol 16104

2015-07-29 Thread Hong
address Satish's request, we'll update petsc interface to this version of superlu_dist. Anthony: Please download the latest superlu_dist-v4.1, then configure petsc with '--download-superlu_dist=superlu_dist_4.1.tar.gz' Hong On Tue, Jul 28, 2015 at 11:11 AM, Satish Balay ba...@mcs.anl.gov wrote

[petsc-users] Fwd: SuperLU MPI-problem

2015-07-29 Thread Hong
superlu_dist v4.1 2. remove existing PETSC_ARCH directory, then configure petsc with '--download-superlu_dist=superlu_dist_4.1.tar.gz' 3. build petsc Let us know if the issue remains. Hong -- Forwarded message -- From: Xiaoye S. Li x...@lbl.gov Date: Wed, Jul 29, 2015 at 2:24 PM

Re: [petsc-users] Can't expand MemType 1: jcol 16104

2015-07-28 Thread Hong
=13.738475134194639, LUstruct=0x9203c8, grid=0x9202c8, stat=0x7fff9cd84880, info=0x7fff9cd848bc) at pzgstrf.c:1308 if (recv_req[0] != MPI_REQUEST_NULL) { -- MPI_Wait (recv_req[0], status); We will update petsc interface to superlu_dist v4.1. Hong On Mon, Jul 27

Re: [petsc-users] I am wondering if there is a way to implement SPMM

2015-08-05 Thread Hong
. Hong On Wed, Aug 5, 2015 at 4:42 AM, Cong Li solvercorle...@gmail.com wrote: Hi I tried the method you suggested. However, I got the error message. My code and message are below. K is the big matrix containing column matrices. code: call MatGetArray(K,KArray,KArrayOffset,ierr) call

Re: [petsc-users] I am wondering if there is a way to implement SPMM

2015-08-05 Thread Hong
? If not, replace MatCreateDense() with MatMatMult(A,Km(stepIdx-1),MAT_INITIAL_MATRIX,...). Is matrix A dense or sparse? Hong On Wed, Aug 5, 2015 at 9:43 AM, Cong Li solvercorle...@gmail.com wrote: Hong, Thanks for your answer. However, in my problem, I have a pre-allocated matrix K, and its
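A minimal C sketch of that replacement (A and the Krylov block Km(stepIdx-1), here called Kprev, are from the thread; the rest is illustrative), letting MatMatMult() allocate the dense product on the first call and reuse it afterwards:

    Mat Knext;
    ierr = MatMatMult(A,Kprev,MAT_INITIAL_MATRIX,PETSC_DEFAULT,&Knext);CHKERRQ(ierr);
    /* later steps with unchanged sizes/sparsity can reuse the product matrix: */
    ierr = MatMatMult(A,Kprev,MAT_REUSE_MATRIX,PETSC_DEFAULT,&Knext);CHKERRQ(ierr);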

Re: [petsc-users] SuperLU MPI-problem

2015-08-03 Thread Hong
SamePattern_SameRowPerm I do not understand why your code uses matrix input mode = global. Hong *From:* Hong [mailto:hzh...@mcs.anl.gov] *Sent:* den 3 augusti 2015 16:46 *To:* Xiaoye S. Li *Cc:* Ülker-Kaustell, Mahir; Hong; PETSc users list *Subject:* Re: [petsc-users] SuperLU MPI-problem

Re: [petsc-users] SIGSEGV in Superlu_dist

2015-08-11 Thread Hong
Anthony, I pushed a fix https://bitbucket.org/petsc/petsc/commits/ceeba3afeff0c18262ed13ef92e2508ca68b0ecf Once it passes our nightly tests, I'll merge it to petsc-maint, then petsc-dev. Thanks for reporting it! Hong On Mon, Aug 10, 2015 at 4:27 PM, Barry Smith bsm...@mcs.anl.gov wrote

Re: [petsc-users] I am wondering if there is a way to implement SPMM

2015-08-05 Thread Hong
,..) Hong On Wed, Aug 5, 2015 at 8:56 PM, Cong Li solvercorle...@gmail.com wrote: The entire source code files are attached. Also I copy and paste the here in this email thanks program test implicit none #include finclude/petscsys.h #include finclude/petscvec.h #include finclude

Re: [petsc-users] SIGSEGV in Superlu_dist

2015-08-10 Thread Hong
I'll fix this in the release if no one has done it yet. Hong On Mon, Aug 10, 2015 at 4:27 PM, Barry Smith bsm...@mcs.anl.gov wrote: Anthony, This crash is in PETSc code before it calls the SuperLU_DIST numeric factorization; likely we have a mistake such as assuming a process has

Re: [petsc-users] SIGSEGV in Superlu_dist

2015-08-12 Thread Hong
. This would enable petsc solvers, as well as other packages. Again, thanks for bug reporting. Hong On Tue, Aug 11, 2015 at 1:33 PM, Satish Balay ba...@mcs.anl.gov wrote: yes - the patch will be in petsc 3.6.2. However - you can grab the patch right now - and start using it If using a 3.6.1

Re: [petsc-users] I am wondering if there is a way to implement SPMM

2015-08-06 Thread Hong
Barry: Hong, we want to reuse the space in the Km(stepIdx-1) from which it was created which means that MAT_INITIAL_MATRIX cannot be used. Since the result is always dense it is not the difficult case when a symbolic computation needs to be done initially so, at least in theory, he should

Re: [petsc-users] I am wondering if there is a way to implement SPMM

2015-08-06 Thread Hong
Cong: Hong, Sure. I want to extend the Krylov subspace by step_k dimensions by using monomial, which can be defined as K={Km(1)m Km(2), ..., Km(step_k)} ={Km(1), AKm(1), AKm(2), ... , AKm(step_k-1)} ={R, AR, A^2R, ... A^(step_k-1)R} A subspace with dense matrices as basis? How

Re: [petsc-users] Retrieving the L U factors

2015-07-27 Thread Hong
Zin: See petsc/src/mat/examples/tests/ex130.c petsc/src/ksp/ksp/examples/tutorials/ex52.c Hong Hi I would like to know how I can retrieve the lower triangular matrix (possibly permuted or preferably the inverse of L) and the upper triangular matrix (preferably the inverse of U) from the LU

Re: [petsc-users] Can't expand MemType 1: jcol 16104

2015-07-27 Thread Hong
per process 1 ... I realize that I use superlu_dist v4.0. Would v4.1 work? I'll give it a try tomorrow. Hong On Mon, Jul 27, 2015 at 1:25 PM, Anthony Paul Haas a...@email.arizona.edu wrote: Hi Hong, No that is not the correct matrix. Note that I forgot to mention that it is a complex matrix

Re: [petsc-users] SuperLU MPI-problem

2015-07-22 Thread Hong
*/ We do not change anything else. Hong On Wed, Jul 22, 2015 at 2:19 PM, Xiaoye S. Li x...@lbl.gov wrote: I am trying to understand your problem. You said you are solving Naviers equation (elastodynamics) in the frequency domain, using finite element discretization. I wonder why you have about

Re: [petsc-users] Can't expand MemType 1: jcol 16104

2015-07-24 Thread Hong
to experiment with your matrix on a target machine to find out. Hong Subroutine HowBigLUCanBe(rank) IMPLICIT NONE integer(i4b),intent(in) :: rank integer(i4b):: i,ct real(dp):: begin,endd complex(dpc):: sigma

Re: [petsc-users] PETSC MPI error on PetscGatherMessageLengths2

2015-07-14 Thread Hong
MatTransposeMatMult() using petsc/src/mat/examples/tests/ex94.c Hong I'm running some code written by myself, using PETSC with MPI. It runs fine with less than or equal to 12 cores. However, if I ran it with 16 cores, it gives me an error. By looking at the error message, it seems

Re: [petsc-users] Must call MatXXXSetPreallocation() or MatSetUp() on argument 1 mat before MatGetFactorAvailable()

2015-07-15 Thread Hong
);CHKERRQ(ierr); ex33.c:ierr = MatLoad(A,viewer);CHKERRQ(ierr); ex33.c: ierr = MatLoad(B,viewer);CHKERRQ(ierr); ex37.c: ierr = MatLoad(A,fd);CHKERRQ(ierr); ex43.c: ierr = MatLoad(A,fd);CHKERRQ(ierr); ex6.c: ierr = MatLoad(A,fd);CHKERRQ(ierr); ex7.c: ierr = MatLoad(A,fd);CHKERRQ(ierr); Hong

Re: [petsc-users] SuperLU MPI-problem

2015-07-20 Thread Hong
crash in the 1st symbolic factorization? In your case, the matrix data structure stays the same when omega changes, so you only need to do the matrix symbolic factorization once and reuse it. 3. Use a machine that gives larger memory. Hong Dear Petsc-Users, I am trying to use PETSc to solve a set of linear

Re: [petsc-users] Must call MatXXXSetPreallocation() or MatSetUp() on argument 1 mat before MatGetFactorAvailable()

2015-07-15 Thread Hong
Mehrzad : The error occurs at MatCreateNormal(A,N), a function rarely used and not well tested. We will fix it. Do you need this function? Hong Hello everyone, I'm really new to Petsc and when I try to run ksp/ksp/examples/tutorials/ex27 I get this error [0]PETSC ERROR: Object
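For reference, a minimal sketch of the call in question, creating the implicit normal-equations operator that ex27 relies on; ksp, A, and the binary viewer fd are placeholders, and the Pmat choice is only illustrative:

    Mat N;
    ierr = MatLoad(A,fd);CHKERRQ(ierr);            /* load/assemble A first */
    ierr = MatCreateNormal(A,&N);CHKERRQ(ierr);    /* implicit N = A^T A */
    ierr = KSPSetOperators(ksp,N,N);CHKERRQ(ierr);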

Re: [petsc-users] issues with sparse direct solvers

2015-08-25 Thread Hong
Gideon: -mat_mumps_icntl_4 0: ICNTL(4): level of printing (0 to 4) (None) This is for diagnostic output, not for regular runs. Use the default '0' for it. Hong On Tue, Aug 25, 2015 at 9:06 AM, Gideon Simpson gideon.simp...@gmail.com wrote: Regarding the MUMPS issue, I’m not sure

Re: [petsc-users] MatPtAP for involving MPIDENSE and MPIAIJ matrices

2015-10-21 Thread Hong
is an MPIDENSE matrix and A is an MPIAIJ matrix. Let us know if you see any bug or performance issues. Hong On Fri, Oct 16, 2015 at 10:25 AM, Jed Brown <j...@jedbrown.org> wrote: > Hong <hzh...@mcs.anl.gov> writes: > > > Jed: > >> > >> > >> > I plan

Re: [petsc-users] ILU preconditioner hangs with some zero elements on the diagonal

2015-10-27 Thread Hong
< 1.e-12 Is this the same matrix as you mentioned? Hong > > > On Tue, Oct 27, 2015 at 9:10 AM, Matthew Knepley <knep...@gmail.com> > wrote: > > On Tue, Oct 27, 2015 at 9:06 AM, Gary Rebt <gary.r...@gmx.ch[ > gary.r...@gmx.ch]> wrote: > > Dear petsc

Re: [petsc-users] ILU preconditioner hangs with some zero elements on the diagonal

2015-10-27 Thread Hong
Matt: > On Tue, Oct 27, 2015 at 11:13 AM, Hong <hzh...@mcs.anl.gov> wrote: > >> Gary : >> I tested your mat.bin using >> petsc/src/ksp/ksp/examples/tutorials/ex10.c >> ./ex10 -f0 $D/mat.bin -rhs 0 -ksp_monitor_true_residual -ksp_view >> ... >> M

Re: [petsc-users] ILU preconditioner hangs with some zero elements on the diagonal

2015-10-27 Thread Hong
Object: 1 MPI processes type: ilu ILU: out-of-place factorization ... Hong On Tue, Oct 27, 2015 at 12:36 PM, Hong <hzh...@mcs.anl.gov> wrote: > Matt: > >> On Tue, Oct 27, 2015 at 11:13 AM, Hong <hzh...@mcs.anl.gov> wrote: >> >>> Gary : >>> I

Re: [petsc-users] ILU preconditioner hangs with some zero elements on the diagonal

2015-10-27 Thread Hong
norm 2.802972716423e+03 2 KSP Residual norm 2.039112137210e+03 ... 24 KSP Residual norm 2.666350543810e-02 Number of iterations = 24 Residual norm 0.0179698 Hong On Tue, Oct 27, 2015 at 1:50 PM, Barry Smith <bsm...@mcs.anl.gov> wrote: > > > On Oct 27, 2015, at 12:40 PM, Hong <

Re: [petsc-users] [SLEPc] any restriction on the calling order of EPS functions?

2015-10-27 Thread Hong
Denis: Your code looks fine to me. There are examples under slepc/src/eps/examples/tutorials using ST with SHELL, e.g., ex10.c Hong Dear developers, > > I wonder if there are any restriction (apart from obvious) on the calling > order of EPS functions? > Is the following logic corre

Re: [petsc-users] SuperLU_dist computes rubbish

2015-11-16 Thread Hong
on it, and forgot to check '-ksp_converged_reason'. However, superlu_dist does not report zero pivot, might simply 'exit'. I'll contact Sherry about it. Hong > > The matrix has a zero pivot with the nd ordering > > $ ./ex10 -pc_type lu -ksp_monitor_true_residual -f0 ~/Downloads/mat.

Re: [petsc-users] SuperLU_dist computes rubbish

2015-11-15 Thread Hong
uperlu -mat_superlu_conditionnumber Recip. condition number = 1.137938e-03 Number of iterations = 1 Residual norm < 1.e-12 As you see, matrix is well-conditioned. Why is it so sensitive to matrix ordering? Hong Using attached petsc4py code, matrix and right-hand side, SuperLU_dist > returns t

Re: [petsc-users] PetscOptionsGetString Not Finding Option

2015-11-01 Thread Hong
Jared : Either call KSPSetPCSide() or change const char name[] = "-ksp_pc_side" to a non-petsc option name, e.g., "-my_ksp_pc_side". Hong Hello, > I am trying to use PetscOptionsGetString to retrieve the value of an > option in the options database, but the
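A sketch of the second suggestion, using the PETSc 3.5/3.6-era signature (in later releases the call takes a leading PetscOptions argument); the option name is deliberately non-PETSc so it does not collide with the reserved -ksp_pc_side:

    char      side[64];
    PetscBool flg;
    ierr = PetscOptionsGetString(NULL,"-my_ksp_pc_side",side,sizeof(side),&flg);CHKERRQ(ierr);
    if (flg) {ierr = PetscPrintf(PETSC_COMM_WORLD,"requested pc side: %s\n",side);CHKERRQ(ierr);}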

Re: [petsc-users] CG+GAMG convergence issues in GHEP Krylov-Schur for some MPI runs

2015-11-06 Thread Hong
convergence behavior. Hong After running in debug mode it seems that the GAMG solver indeed did not > converge, however throwing the error leads to SIGABRT (backtrace and frames > are below). > It is still very suspicious why would solving for (unchanged) mass matrix > wouldn't con

Re: [petsc-users] MatPtAP for involving MPIDENSE and MPIAIJ matrices

2015-10-16 Thread Hong
e+00 4.e+00 5.e+00 5.e+00 5.e+00 5.e+00 5.e+00 i.e., elemental and petsc dense matrices have same ownership. If there is no data movement for MatConvert(), then it would be easier to use elemental. Hong

Re: [petsc-users] MatPtAP for involving MPIDENSE and MPIAIJ matrices

2015-10-15 Thread Hong
I plan to implement MatTransposeMatMult_MPIDense_MPIDense via 1. add MatTransposeMatMult_elemental_elemental() 2. C_dense = P_dense^T * B_dense via MatConvert_dense_elemental() and MatConvert_elemental_dense() Let me know if you have better suggestions. Hong On Thu, Oct 15, 2015 at 1:49 PM
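For context, the user-facing call this implementation would serve is the standard MatTransposeMatMult(); a hypothetical fragment with dense P and B:

    Mat C;
    /* C = P^T * B, with P and B stored as MPIDENSE matrices */
    ierr = MatTransposeMatMult(P,B,MAT_INITIAL_MATRIX,PETSC_DEFAULT,&C);CHKERRQ(ierr);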

Re: [petsc-users] Parallel 3d decomposed FFT

2015-09-10 Thread Hong
il. > But is there any way I can use petsc to implement a 3d decomposed FFT? > I guess you do parallel 3D real transform, which is not supported by FFTW. We are not experts on FFT. You have to search external packages that implement it. Hong

Re: [petsc-users] MatMatMult vs series of MatMult

2015-09-16 Thread Hong
rs in > a MATMPIDENSE matrix; or > 2. Store the vectors in a MATMPIDENSE matrix and perform a MatMatMult > operation. > 2 would be more efficient. Hong

Re: [petsc-users] Trivial parallelizing in SLEPc

2015-09-23 Thread Hong
multiplications. If I provide the correct > operations when constructing my MatShell, can I expect the FEAST algorithm > to compute each contour point on a different process? > Slepc developer might answer this question. Hong

Re: [petsc-users] Error reported by MUMPS in numerical factorization phase

2015-12-02 Thread Hong
In this way, mumps would dump out more information. > > Then I tried the same simulation on another machine using the same number > of processors, it does not fail. > Does this machine have larger memory? Hong

Re: [petsc-users] Error reported by MUMPS in numerical factorization phase

2015-12-02 Thread Hong
> I do not think you need to change this part of code. Does your code check convergence at each time step? Hong > > > On 15-12-02 08:39 AM, Hong wrote: > Danyang : >> >> My code fails due to the error in external library. It works fine for the >> previous 2000

Re: [petsc-users] SuperLU convergence problem (More test)

2015-12-08 Thread Hong
168 - 172, got Recip. condition number = 1.548816e-12. You need to check your model to understand why the matrices are so ill-conditioned. Hong Hi Hong, > > Sorry to bother you again. The modified code works much better than before > using both superlu or mumps. However, it still encounter

Re: [petsc-users] SuperLU convergence problem

2015-12-03 Thread Hong
Danyang : Further testing a_flow_check_168.bin, ./ex10 -f0 /Users/Hong/Downloads/matrix_and_rhs_bin/a_flow_check_168.bin -rhs /Users/Hong/Downloads/matrix_and_rhs_bin/x_flow_check_168.bin -pc_type lu -pc_factor_mat_solver_package superlu -ksp_monitor_true_residual -mat_superlu_conditionnumber

Re: [petsc-users] SuperLU convergence problem

2015-12-03 Thread Hong
Danyang: Using petsc/src/ksp/ksp/examples/tutorials/ex10.c, I tested a_flow_check_168.bin mpiexec -n 4 ./ex10 -f0 /Users/Hong/Downloads/matrix_and_rhs_bin/a_flow_check_168.bin -rhs /Users/Hong/Downloads/matrix_and_rhs_bin/x_flow_check_168.bin -pc_type lu -pc_factor_mat_solver_package superlu_dist

Re: [petsc-users] SuperLU convergence problem (More test)

2015-12-07 Thread Hong
Danyang: Add 'call MatSetFromOptions(A,ierr)' to your code. Attached below is ex52f.F modified from your ex52f.F to be compatible with petsc-dev. Hong Hello Hong, > > Thanks for the quick reply and the option "-mat_superlu_dist_fact > SamePattern" works like a charm, if I u

Re: [petsc-users] SuperLU convergence problem (More test)

2015-12-07 Thread Hong
est, bye Sherry may tell you why SamePattern_SameRowPerm causes the difference here. Based on the above experiments, I would set the following as defaults: '-mat_superlu_diagpivotthresh 0.0' in the petsc/superlu interface, '-mat_superlu_dist_fact SamePattern' in the petsc/superlu_dist interface. H

Re: [petsc-users] SuperLU convergence problem

2015-12-03 Thread Hong
/results-check.tar.gz?dl=0>* > Can you send us matrix in petsc binary format? e.g., call MatView(M, PETSC_VIEWER_BINARY_(PETSC_COMM_WORLD)) or '-ksp_view_mat binary' Hong > > > Below is a summary of the norm from the three solvers at timestep 29, > newton iteration 1 to 5. >
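A small sketch of the in-code variant of that request, writing M to a named file in PETSc binary format (the filename is illustrative):

    PetscViewer viewer;
    ierr = PetscViewerBinaryOpen(PETSC_COMM_WORLD,"mat.bin",FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
    ierr = MatView(M,viewer);CHKERRQ(ierr);
    ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);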

Re: [petsc-users] SPRNG package

2015-12-11 Thread Hong
as written many years ago. It is for parallel computation. Students contributed an example at petsc/src/sys/classes/random/examples/tutorials/ex2.c Very few users have ever used this interface. If you encounter any problem, please report to us. Hong

Re: [petsc-users] SPRNG package

2015-12-11 Thread Hong
Barry : > > > there is a comment: > > > >This is NOT currently using a parallel random number generator. Sprng > does have > >an MPI version we should investigate. > Shall we remove this comment? Hong > > >> On Dec 11, 2015, at 11:30 AM, Hong

Re: [petsc-users] KSPSetUp with PETSc/MUMPS

2016-05-26 Thread Hong
I'll investigate this - had a day off since yesterday. Hong On Thu, May 26, 2016 at 12:04 PM, Barry Smith <bsm...@mcs.anl.gov> wrote: > > Hong needs to run with this matrix and add appropriate error checkers in > the matrix routines to detect "incomplete" matrices an

Re: [petsc-users] KSPSetUp with PETSc/MUMPS

2016-05-27 Thread Hong
Satish, I tested your fix on ex51f.F90 (modified from build_nullbasis_petsc_mumps.F90) --it gives clean results with valgrind. Could you patch it to petsc-maint? I would also like to add ex51f.F90 (contributed by Constantin) to petsc/src/ksp/ksp/examples/tests/. Hong On Thu, May 26, 2016 at 5:15 PM

Re: [petsc-users] KSPSetUp with PETSc/MUMPS

2016-05-26 Thread Hong
of MATMPIAIJ/MATMPIDENSE MATAIJ wraps MATSEQAIJ and MATMPIAIJ. 2) MatConvert(x, MATMPIAIJ, MAT_REUSE_MATRIX, x,ierr) -> MatConvert(x, MATMPIAIJ, MAT_INPLACE_MATRIX, x,ierr) see http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatConvert.html Hong On Thu, May 26, 2016 at 3:05 PM, Satish Ba

Re: [petsc-users] Receiving DIVERGED_PCSETUP_FAILED

2016-06-22 Thread Hong
16 1.0 6.4163e-01 MatLUFactorSym 1 1.0 2.4772e+00 MatLUFactorNum 1 1.0 8.6419e-01 However, petsc only interfaces with sequential mkl_pardiso. Did you get results in parallel or sequential? Hong > > > > > -- > *From:* Fara

Re: [petsc-users] ODE Solver on multiple cores

2016-01-13 Thread Hong
Fahad: Run your code with '-ts_view' to see what solvers are being used for the sequential and parallel runs. Hong Hello, > > > > I have written a code to solve a simple differential equation (x’’+x’+6x=0 > with initial values, x(0)=2, x’(0)=3). It works well on a single core and > pro
