The example runs fine if I build & run manually. Something else must
have caused the malloc failure last night..

Satish

---------

petsc@n-gage:~/petsc.test/src/dm/examples/tests$ env |grep PETSC
PETSC_ARCH=arch-opensolaris-cmplx-pkgs-dbg
PETSC_DIR=/export/home/petsc/petsc.test
petsc@n-gage:~/petsc.test/src/dm/examples/tests$ make ex36
/export/home/petsc/soft/mpich-3.1.3/bin/mpicc -o ex36.o -c -g  
-I/export/home/petsc/petsc.test/include 
-I/export/home/petsc/petsc.test/arch-opensolaris-cmplx-pkgs-dbg/include 
-I/export/home/petsc/soft/mpich-3.1.3/include    `pwd`/ex36.c
/export/home/petsc/soft/mpich-3.1.3/bin/mpicc -g  -o ex36 ex36.o  
-R/export/home/petsc/petsc.test/arch-opensolaris-cmplx-pkgs-dbg/lib 
-L/export/home/petsc/petsc.test/arch-opensolaris-cmplx-pkgs-dbg/lib  -lpetsc 
-R/export/home/petsc/petsc.test/arch-opensolaris-cmplx-pkgs-dbg/lib -lcmumps 
-ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lscalapack -lflapack -lfblas 
-lparmetis -lmetis -lssl -lcrypto -ltriangle -lX11 -lm 
-R/export/home/petsc/soft/mpich-3.1.3/lib 
-L/export/home/petsc/soft/mpich-3.1.3/lib -lmpifort -R/opt/SUNWspro/lib 
-L/opt/SUNWspro/lib -R/opt/SUNWspro/prod/lib -L/opt/SUNWspro/prod/lib 
-R/usr/ccs/lib -L/usr/ccs/lib -lfui -lfai -lfsu -lsunmath -lmtsk -lm -ldl 
-lmpicxx -lmpi 
-R/opt/SUNWspro/lib/rw7:/opt/SUNWspro/lib:/usr/ccs/lib:/lib:/usr/lib -lCstd 
-lCrun  -ldl  -lsocket -lnsl
/usr/bin/rm -f -f ex36.o
petsc@n-gage:~/petsc.test/src/dm/examples/tests$ make runex36_2dp1
petsc@n-gage:~/petsc.test/src/dm/examples/tests$ make runex36_2dp1
petsc@n-gage:~/petsc.test/src/dm/examples/tests$ make runex36_2dp1
petsc@n-gage:~/petsc.test/src/dm/examples/tests$ 



On Fri, 30 Oct 2015, Satish Balay wrote:

> Hm - the error below is from wii (freebsd). But I don't see that in tonight's
> builds.
> 
> The opensolaris issue looks different [out of memory]. That's weird.
> 
> I'll check if I can reproduce..
> 
> Satish
> 
> On Fri, 30 Oct 2015, Barry Smith wrote:
> 
> > 
> >   The log file is 
> > http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2015/10/30/examples_next_arch-opensolaris-cmplx-pkgs-dbg_n-gage.log
> > 
> > hence the machine name is n-gage and the PETSC_ARCH is 
> > arch-opensolaris-cmplx-pkgs-dbg
> > 
> > 
> > > On Oct 30, 2015, at 2:49 PM, Matthew Knepley <[email protected]> wrote:
> > > 
> > > Cool. Which machine had the error you sent? I can't find it.
> > > 
> > >    Matt
> > > 
> > > On Oct 30, 2015 2:47 PM, "Barry Smith" <[email protected]> wrote:
> > > 
> > >   Satish can provide more details on how you can easily run in the EXACT 
> > > environment where something broke to debug it much faster.  The model is
> > > 
> > > 1)  ssh [email protected]
> > > 
> > > 2) ssh testmachine  (testmachine is always the end part of the name of 
> > > the log file)
> > > 
> > > 3) cd to either  /sandbox/petsc/petsc.test  or /home/petsc/petsc.test 
> > > depending on the machine
> > > 
> > > 4) git fetch
> > > 
> > > 5) git checkout the broken branch
> > > 
> > > 6) set PETSC_ARCH=arch of broken machine, this is in the name of the log 
> > > filename
> > > 
> > > 7) ./config/examples/${PETSC_ARCH}.py
> > > 
> > > 8) build PETSc and debug away.
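> > > The eight steps above can be sketched as a shell session. The script
> > > below is a dry run that only prints the commands, since the real ones
> > > need access to the MCS machines; the test machine name, branch, and
> > > petsc.test path are placeholders to substitute from the failing log
> > > (the PETSC_ARCH shown is the one from this thread).

```shell
#!/bin/sh
# Dry-run sketch of steps 1-8 above. TESTMACHINE and BRANCH are
# placeholders to fill in from the failing nightly log's filename;
# the default PETSC_ARCH is the one discussed in this thread.
TESTMACHINE=${TESTMACHINE:-n-gage}
BRANCH=${BRANCH:-next}
PETSC_ARCH=${PETSC_ARCH:-arch-opensolaris-cmplx-pkgs-dbg}

# Build the command list; printed rather than executed.
STEPS="ssh [email protected]
ssh ${TESTMACHINE}
cd /home/petsc/petsc.test        # or /sandbox/petsc/petsc.test
git fetch
git checkout ${BRANCH}
export PETSC_ARCH=${PETSC_ARCH}
./config/examples/${PETSC_ARCH}.py
make all"

printf '%s\n' "$STEPS"
```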
> > > 
> > >   If you have any trouble with this, let Satish and me know. It is the 
> > > intention that debugging on the test machines should be very 
> > > straightforward and not require getting help from anyone or following 
> > > convoluted instructions.
> > > 
> > >   Barry
> > > 
> > > > On Oct 30, 2015, at 2:15 PM, Matthew Knepley <[email protected]> wrote:
> > > >
> > > > I ran it through valgrind on my machine with no problems. Checking logs
> > > >
> > > >    Matt
> > > >
> > > > On Thu, Oct 29, 2015 at 5:09 PM, Barry Smith <[email protected]> wrote:
> > > >
> > > > <   [3] Roots referenced by my leaves, by rank
> > > > < Symmetric gradient null space: PASS
> > > > < Function tests pass for order 0 at tolerance 1e-10
> > > > < Function tests pass for order 0 derivatives at tolerance 1e-10
> > > > ---
> > > > > [2]PETSC ERROR: #1 PetscCommDuplicate() line 178 in 
> > > > > /usr/home/balay/petsc.clone-3/src/sys/objects/tagm.c
> > > > > [2]PETSC ERROR: #2 PetscHeaderCreate_Private() line 60 in 
> > > > > /usr/home/balay/petsc.clone-3/src/sys/objects/inherit.c
> > > > > [2]PETSC ERROR: #3 PetscSFCreate() line 44 in 
> > > > > /usr/home/balay/petsc.clone-3/src/vec/is/sf/interface/sf.c
> > > > > [2]PETSC ERROR: #4 DMPlexDistribute() line 1562 in 
> > > > > /usr/home/balay/petsc.clone-3/src/dm/impls/plex/plexdistribute.c
> > > > > [2]PETSC ERROR: #5 CreateMesh() line 232 in 
> > > > > /usr/home/balay/petsc.clone-3/src/dm/impls/plex/examples/tests/ex3.c
> > > > > [2]PETSC ERROR: #6 main() line 911 in 
> > > > > /usr/home/balay/petsc.clone-3/src/dm/impls/plex/examples/tests/ex3.c
> > > > > [2]PETSC ERROR: PETSc Option Table entries:
> > > > > [2]PETSC ERROR: -dim 3
> > > > > [2]PETSC ERROR: -dm_plex_max_projection_height 2
> > > > > [2]PETSC ERROR: -dm_view ascii::ASCII_INFO_DETAIL
> > > > > [2]PETSC ERROR: -malloc_dump
> > > > > [2]PETSC ERROR: -nox
> > > > > [2]PETSC ERROR: -nox_warning
> > > > > [2]PETSC ERROR: -num_comp 3
> > > > > [2]PETSC ERROR: -petscpartitioner_type simple
> > > > > [2]PETSC ERROR: -petscspace_order 1
> > > > > [2]PETSC ERROR: -petscspace_poly_tensor
> > > > > [2]PETSC ERROR: -qorder 1
> > > > > [2]PETSC ERROR: -simplex 0
> > > > > [2]PETSC ERROR: -test_fe_jacobian
> > > > > [2]PETSC ERROR: -tree
> > > > > [2]PETSC ERROR: ----------------End of Error Message -------send 
> > > > > entire error message to [email protected]
> > > > > application called MPI_Abort(MPI_COMM_WORLD, 1) - process 2
> > > > > [cli_2]: aborting job:
> > > > > application called MPI_Abort(MPI_COMM_WORLD, 1) - process 2
> > > > >
> > > > > ===================================================================================
> > > > > =   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> > > > > =   PID 15231 RUNNING AT wii
> > > > > =   EXIT CODE: 1
> > > > > =   CLEANING UP REMAINING PROCESSES
> > > > > =   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> > > > > ===================================================================================
> > > > /usr/home/balay/petsc.clone-3/src/dm/impls/plex/examples/tests
> > > > Possible problem with with runex3_nonconforming_tensor_3, diffs above
> > > >
> > > >
> > > >
> > > >
> > > > --
> > > > What most experimenters take for granted before they begin their 
> > > > experiments is infinitely more interesting than any results to which 
> > > > their experiments lead.
> > > > -- Norbert Wiener
> > > 
> > 
> > 
> 
> 
