Ah, of course. Here it is:

Malloc fails for Lnzval_bc_ptr[*][] at line 626 in file zdistribute.c
Malloc fails for Lnzval_bc_ptr[*][] at line 626 in file zdistribute.c
col block 797 col block 1518 Malloc fails for Lnzval_bc_ptr[*][] at line 626 in file zdistribute.c
col block 6369
-------------------------------------------------------
Primary job  terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.
-------------------------------------------------------
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 15 Terminate: Some process (or the batch system) has told this process to end
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: ---------------------  Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR:       INSTEAD the line number of the start of the function
[0]PETSC ERROR:       is given.
[0]PETSC ERROR: [0] SuperLU_DIST:pzgssvx_ABglobal line 405 /packages/petsc-3.6.0/src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c
[0]PETSC ERROR: [0] MatLUFactorNumeric_SuperLU_DIST line 282 /packages/petsc-3.6.0/src/mat/impls/aij/mpi/superlu_dist/superlu_dist.c
[0]PETSC ERROR: [0] MatLUFactorNumeric line 2937 /packages/petsc-3.6.0/src/mat/interface/matrix.c
[0]PETSC ERROR: [0] PCSetUp_LU line 99 /packages/petsc-3.6.0/src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: [0] PCSetUp line 944 /packages/petsc-3.6.0/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: [0] KSPSetUp line 247 /packages/petsc-3.6.0/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: User provided function() line 0 in  unknown file (null)
--------------------------------------------------------------------------
mpiexec detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was:

 Process name: [[40547,1],3]
  Exit code:    255
--------------------------------------------------------------------------

Mahir


________________________________
Mahir Ülker-Kaustell, Kompetenssamordnare, Brokonstruktör, Tekn. Dr, Tyréns AB
010 452 30 82, [email protected]
________________________________

From: Matthew Knepley [mailto:[email protected]]
Sent: den 20 juli 2015 17:35
To: Ülker-Kaustell, Mahir
Cc: petsc-users
Subject: Re: [petsc-users] SuperLU MPI-problem

We cannot say anything without the full error message.

  Matt

On Mon, Jul 20, 2015 at 9:38 AM, [email protected] <[email protected]> wrote:
Dear Petsc-Users,

I am trying to use PETSc to solve a set of linear equations arising from Navier's equation (elastodynamics) in the frequency domain.
The frequency dependence of the problem requires that the system

                             [-omega^2 M + K] u = F

where M and K are constant, square, positive definite matrices (mass and stiffness, respectively), be solved for each frequency omega of interest.
K is a complex matrix, including material damping.

I have written a PETSc program which solves this problem for a small test problem (1000 degrees of freedom) on one or several processors, but it keeps crashing when I try it on my full-scale problem (on the order of 10^6 degrees of freedom).

The program crashes at KSPSetUp() and, from what I can see in the error messages, it appears to consume too much memory.
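
In outline, the solver part of the program looks like this (a simplified sketch with hypothetical names; the actual assembly of M, K, and F is omitted, and it assumes a PETSc 3.6 build with complex scalars and SuperLU_DIST):

/* Minimal sketch of the frequency loop (hypothetical names; assumes M and K
 * are assembled elsewhere with compatible parallel layouts). */
#include <petscksp.h>

PetscErrorCode SolveFrequencies(Mat M, Mat K, Vec F, Vec u,
                                const PetscReal *omega, PetscInt nfreq)
{
  Mat            A;
  KSP            ksp;
  PC             pc;
  PetscInt       i;
  PetscErrorCode ierr;

  ierr = MatDuplicate(K, MAT_COPY_VALUES, &A);CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr);   /* direct solve only */
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);
  ierr = PCFactorSetMatSolverPackage(pc, MATSOLVERSUPERLU_DIST);CHKERRQ(ierr);

  for (i = 0; i < nfreq; i++) {
    /* A = K - omega^2 M, rebuilt for each frequency */
    ierr = MatCopy(K, A, DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = MatAXPY(A, -omega[i]*omega[i], M, DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
    ierr = KSPSetUp(ksp);CHKERRQ(ierr);                /* crashes here on the large problem */
    ierr = KSPSolve(ksp, F, u);CHKERRQ(ierr);
  }

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  return 0;
}

The same factorization choice can also be made at run time with -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package superlu_dist.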

I would guess that similar problems have come up on this mailing list before, so I am hoping that someone can push me in the right direction…

Mahir

--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener
