It's merrily passing all my MTT tests, so it appears to be fine for me.

It would help if you provided *some* information along with these reports - 
like how this was configured, what environment you are running under, how many 
nodes you were using, etc. Otherwise, it's a totally useless report.
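For example, something along these lines would be enough (everything below is a
placeholder, not a guess at your actual setup):

  - the configure line, e.g. ./configure --prefix=/opt/ompi-trunk --with-slurm --enable-debug
  - OS/kernel, interconnect, and scheduler on the test nodes
  - the number of nodes and the exact launch command, e.g.
      mpirun -np 8 -host node1,node2 examples/hello_usempi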


On Jun 1, 2014, at 9:04 PM, Ralph Castain <r...@open-mpi.org> wrote:

> I'm afraid that tells me absolutely nothing.
> 
> 
> On Jun 1, 2014, at 8:50 PM, Mike Dubman <mi...@dev.mellanox.co.il> wrote:
> 
>> Hi,
>> The trunk hangs after the following commits; it seems commits 3-5 and 7 may 
>> be the ones responsible.
>> Changes
>> Java-oshmem: update examples 
>> Java: update javadoc's install locations 
>> Replace the PML barrier with an RTE barrier for now until we can come up 
>> with a better solution for connectionless BTLs.
>> Per RFC:
>> Per RFC:
>> Cleanup the test so it is MPI correct
>> Cleanup compile issues - missing updates to some plm components and the 
>> slurm ras component
>> The SIGSEGV is only what the timeout below sends in order to print a stack 
>> trace; the real problem is a hang.
>> 
>> 22:39:46 + timeout -s SIGSEGV 3m 
>> /scrap/jenkins/scrap/workspace/hpc-ompi-shmem/label/hpc-test-node/ompi_install1/bin/mpirun
>>  -np 8 
>> /scrap/jenkins/scrap/workspace/hpc-ompi-shmem/label/hpc-test-node/ompi_install1/examples/hello_usempi
>> 22:39:47 [vegas12:17297] *** Process received signal ***
>> 22:39:47 [vegas12:17297] Signal: Segmentation fault (11)
>> 22:39:47 [vegas12:17297] Signal code: Address not mapped (1)
>> 22:39:47 [vegas12:17297] Failing at address: (nil)
>> 22:39:47 [vegas12:17297] [ 0] /lib64/libpthread.so.0[0x3937c0f500]
>> 22:39:47 [vegas12:17297] [ 1] /lib64/libc.so.6(fgets+0x2d)[0x3937466f2d]
>> 22:39:47 [vegas12:17297] [ 2] 
>> /scrap/jenkins/scrap/workspace/hpc-ompi-shmem/label/hpc-test-node/ompi_install1/lib/openmpi/mca_rtc_freq.so(+0x1f3f)[0x7ffff41f5f3f]
>> 22:39:47 [vegas12:17297] [ 3] 
>> /scrap/jenkins/scrap/workspace/hpc-ompi-shmem/label/hpc-test-node/ompi_install1/lib/openmpi/mca_rtc_freq.so(+0x279b)[0x7ffff41f679b]
>> 22:39:47 [vegas12:17297] [ 4] 
>> /scrap/jenkins/scrap/workspace/hpc-ompi-shmem/label/hpc-test-node/ompi_install1/lib/libopen-rte.so.0(orte_rtc_base_select+0xd5)[0x7ffff7ddc025]
>> 22:39:47 [vegas12:17297] [ 5] 
>> /scrap/jenkins/scrap/workspace/hpc-ompi-shmem/label/hpc-test-node/ompi_install1/lib/openmpi/mca_ess_hnp.so(+0x4056)[0x7ffff725b056]
>> 22:39:47 [vegas12:17297] [ 6] 
>> /scrap/jenkins/scrap/workspace/hpc-ompi-shmem/label/hpc-test-node/ompi_install1/lib/libopen-rte.so.0(orte_init+0x174)[0x7ffff7d97254]
>> 22:39:47 [vegas12:17297] [ 7] 
>> /scrap/jenkins/scrap/workspace/hpc-ompi-shmem/label/hpc-test-node/ompi_install1/bin/mpirun(orterun+0x863)[0x404613]
>> 22:39:47 [vegas12:17297] [ 8] 
>> /scrap/jenkins/scrap/workspace/hpc-ompi-shmem/label/hpc-test-node/ompi_install1/bin/mpirun(main+0x20)[0x4039e4]
>> 22:39:47 [vegas12:17297] [ 9] 
>> /lib64/libc.so.6(__libc_start_main+0xfd)[0x393741ecdd]
>> 22:39:47 [vegas12:17297] [10] 
>> /scrap/jenkins/scrap/workspace/hpc-ompi-shmem/label/hpc-test-node/ompi_install1/bin/mpirun[0x403909]
>> 22:39:47 [vegas12:17297] *** End of error message ***
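Side note on the trace above, purely as an illustration and not a claim about the
actual mca_rtc_freq code: fgets() faulting at address (nil) usually means it was
handed a NULL FILE *, i.e. an fopen() whose return value was not checked - for
instance a cpufreq sysfs file that does not exist on that node. A minimal C sketch
of that failure pattern and the guard that avoids it (the path is a typical
example only):

  #include <stdio.h>

  int main(void)
  {
      char buf[64];
      /* A typical cpufreq sysfs file; it may simply not exist on a given node,
       * in which case fopen() returns NULL. */
      FILE *fp = fopen("/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor", "r");

      /* Without this check, fgets() dereferences a null stream - the same
       * "Address not mapped" crash at (nil) shown in the trace. */
      if (fp == NULL) {
          perror("fopen");
          return 1;
      }
      if (fgets(buf, sizeof(buf), fp) != NULL) {
          printf("governor: %s", buf);
      }
      fclose(fp);
      return 0;
  }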
>> 
>> M
> 
