Hi Riccardo,
I have done this with helloworld.m, and info is always 0, which means no MPI
error. helloworld.m and mc_example2.m both work fine using the MPI_*
functions that don't use octave_comm.h, so I guess that the problem is
there. I would be happy to run any test code that could help to clear it up.
Cheers, Michael
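
(For context, a minimal sketch of the pattern being tested here: an MPI_* wrapper that hands the MPI error code ("info") back to the script level, which is what lets helloworld.m print it. The function name is invented for illustration; this is not the actual openmpi_ext source.)

#include <octave/oct.h>
#include <mpi.h>

// Illustrative wrapper: returns the MPI error code ("info") plus the rank,
// so a script like helloworld.m can print info and compare it to 0
// (MPI_SUCCESS).  Not the real openmpi_ext binding.
DEFUN_DLD (my_comm_rank, args, ,
           "[info, rank] = my_comm_rank ()")
{
  int rank = -1;
  int info = MPI_Comm_rank (MPI_COMM_WORLD, &rank);  // 0 means MPI_SUCCESS

  octave_value_list retval;
  retval(1) = rank;
  retval(0) = info;
  return retval;
}

(Called from Octave as [info, rank] = my_comm_rank (); an info of 0 corresponds to MPI_SUCCESS, which matches what is reported above.)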
On Sun, Nov 29, 2009 at 10:09 AM, Riccardo Corradini <
riccardocorrad...@yahoo.it> wrote:
> Hi Michael.
> There is a last thing you may do.
> Try printing the info flag both inside the source code and outside. mpi.h
> contains the list of error codes, so you will learn more about whether it is
> an MPI problem or a problem with the public property of octave_comm that
> holds the MPI_COMM_WORLD pointer.
> You may start with helloworld; it will be quick to print the error.
> If you always get MPI_SUCCESS, the problem is only within the octave_comm
> class, which probably needs a destructor.
> C++ is not my field either, but on the other hand I am learning a lot from
> my mistakes.
> Thanks a lot
> Riccardo
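
(A minimal sketch of the check Riccardo describes, done inside the C++ source. The helper name is invented; MPI_Error_string is used to translate the error codes listed in mpi.h.)

#include <cstdio>
#include <mpi.h>

// Print an MPI error code and its text, so it is easy to see whether a
// call returned MPI_SUCCESS or a genuine MPI error.  The helper name is
// made up for illustration.
static void report_mpi_error (const char *where, int info)
{
  if (info == MPI_SUCCESS)
    std::printf ("%s: MPI_SUCCESS\n", where);
  else
    {
      char msg[MPI_MAX_ERROR_STRING];
      int len = 0;
      MPI_Error_string (info, msg, &len);
      std::printf ("%s: MPI error %d (%.*s)\n", where, info, len, msg);
    }
}

// Typical use inside a binding:
//   int info = MPI_Send (buf, count, MPI_DOUBLE, dest, tag, comm);
//   report_mpi_error ("MPI_Send", info);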
> --- On Sun 29/11/09, Michael Creel <michael.cr...@uab.es> wrote:
>
>
> From: Michael Creel <michael.cr...@uab.es>
> Subject: Re: example octave_comm_test updated
> To: "Riccardo Corradini" <riccardocorrad...@yahoo.it>
> Cc: "octave-forge list" <octave-dev@lists.sourceforge.net>
> Date: Sunday, 29 November 2009, 08:50
>
>
> Hi Riccardo,
> I have tried it using Open MPI 1.3.3 compiled from source, and I get
> the same results as using Open MPI 1.3.2 from Ubuntu repos. I don't
> think that the difference in Open MPI versions can be the cause,
> because it has been stable for a while for this simple stuff on the
> common architectures. Using the versions of MPI_* in SVN, both
> helloworld.m and mc_example2.m do what they're expected to, but
> segfault when finishing up. Using the old versions of MPI_*, from
> before the communicator stuff, there is no segfault.
>
> My conclusion is that there is some problem in the communicator
> functions. I'll try to figure it out, but this is not an area I know
> about.
>
> Cheers, M.
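
(Riccardo's destructor remark earlier in the thread amounts to something like the sketch below. This is a generic guess with invented members, not the actual octave_comm class, and whether it is really the cause of the double free is exactly what is being debugged here.)

#include <string>
#include <mpi.h>

// Generic sketch of a communicator-holding class with an explicit
// destructor -- NOT the actual octave_comm class.  The point is only
// that an object owning resources needs its destructor and copy
// semantics kept consistent, otherwise a copy can free the same memory
// twice and glibc aborts with "double free or corruption" during the
// cleanup at exit.
class comm_holder
{
public:
  comm_holder (MPI_Comm c, const std::string& n)
    : comm (c), name (n) { }

  ~comm_holder ()
  {
    // Release only what this object really owns.  MPI_COMM_WORLD is a
    // predefined communicator and must never be freed; only a
    // communicator created by this class would be, e.g.:
    //   MPI_Comm_free (&comm);
  }

private:
  MPI_Comm comm;     // typically just MPI_COMM_WORLD here
  std::string name;  // cleaned up automatically by std::string
};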
>
> On Sat, Nov 28, 2009 at 8:16 PM, Riccardo Corradini
> <riccardocorrad...@yahoo.it>
> wrote:
> >
> > Michael,
> > I have no problems with helloworld, but I use the stable Open MPI 1.3.3;
> > try compiling it from source (look at my explanations of the configure
> > options in the package). I had no problems with any of the examples.
> >
> > http://www.open-mpi.org/software/ompi/v1.3/downloads/openmpi-1.3.3.tar.bz2
> > I will tell you more about my configuration when I am back in my office
> > on Monday.
> > Bests
> > Riccardo
> >
> > --- On Sat 28/11/09, Michael Creel <michael.cr...@uab.es> wrote:
> >
> > From: Michael Creel <michael.cr...@uab.es>
> > Subject: Re: example octave_comm_test updated
> > To: "Riccardo Corradini" <riccardocorrad...@yahoo.it>
> > Cc: "Jaroslav Hajek" <high...@gmail.com>, "octave-forge list"
> > <octave-dev@lists.sourceforge.net>
> > Date: Saturday, 28 November 2009, 16:10
> >
> > Hi Riccardo,
> > I tried running the latest helloworld.m, which uses the new comm stuff,
> > and I get the same errors on exit. I'm posting the output below, in case
> > it's helpful. Please let me know if I can provide any other useful
> > information.
> > Cheers, Michael
> >
> > mich...@yosemite:~/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/inst$ mpirun -np 2 octave -q --eval helloworld
> > We are at rank 0 that is master etc..
> > Greetings from process: 1!
> > *** glibc detected *** octave: double free or corruption (fasttop):
> 0x0000000002c24440 ***
> > ======= Backtrace: =========
> > /lib/libc.so.6[0x7f576b757dd6]
> > /lib/libc.so.6(cfree+0x6c)[0x7f576b75c70c]
> > /usr/lib/libstdc++.so.6(_ZNSsD1Ev+0x39)[0x7f576bf930c9]
> > /lib/libc.so.6(exit+0xe2)[0x7f576b71ac12]
> > /usr/lib/octave-3.2.2/liboctinterp.so(octave_main+0xe1c)[0x7f57720a49ac]
> > /lib/libc.so.6(__libc_start_main+0xfd)[0x7f576b700abd]
> > octave[0x400879]
> > ======= Memory map: ========
> > 00400000-00401000 r-xp 00000000 08:04 142133
> /usr/bin/octave-3.2.2
> > 00600000-00601000 r--p 00000000 08:04 142133
> /usr/bin/octave-3.2.2
> > 00601000-00602000 rw-p 00001000 08:04 142133
> /usr/bin/octave-3.2.2
> > 02351000-02c34000 rw-p 00000000 00:00 0
> [heap]
> > 418a3000-418a5000 rwxp 00000000 00:0f 1382
> /dev/zero
> > 7f5758000000-7f5758021000 rw-p 00000000 00:00 0
> > 7f5758021000-7f575c000000 ---p 00000000 00:00 0
> > 7f575ca88000-7f575ca9a000 r-xp 00000000 08:04 288805
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Finalize.oct
> > 7f575ca9a000-7f575cc9a000 ---p 00012000 08:04 288805
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Finalize.oct
> > 7f575cc9a000-7f575cc9c000 r--p 00012000 08:04 288805
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Finalize.oct
> > 7f575cc9c000-7f575cc9d000 rw-p 00014000 08:04 288805
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Finalize.oct
> > 7f575cc9d000-7f575ccd0000 r-xp 00000000 08:04 288809
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Send.oct
> > 7f575ccd0000-7f575ced0000 ---p 00033000 08:04 288809
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Send.oct
> > 7f575ced0000-7f575ced3000 r--p 00033000 08:04 288809
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Send.oct
> > 7f575ced3000-7f575ced4000 rw-p 00036000 08:04 288809
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Send.oct
> > 7f575ced4000-7f575ceef000 r-xp 00000000 08:04 288803
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_size.oct
> > 7f575ceef000-7f575d0ef000 ---p 0001b000 08:04 288803
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_size.oct
> > 7f575d0ef000-7f575d0f1000 r--p 0001b000 08:04 288803
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_size.oct
> > 7f575d0f1000-7f575d0f2000 rw-p 0001d000 08:04 288803
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_size.oct
> > 7f5762be5000-7f5762c00000 r-xp 00000000 08:04 285390
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_rank.oct
> > 7f5762c00000-7f5762e00000 ---p 0001b000 08:04 285390
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_rank.oct
> > 7f5762e00000-7f5762e02000 r--p 0001b000 08:04 285390
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_rank.oct
> > 7f5762e02000-7f5762e03000 rw-p 0001d000 08:04 285390
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_rank.oct
> > 7f5762e03000-7f5762e1d000 r-xp 00000000 08:04 288820
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/octave_comm_make.oct
> > 7f5762e1d000-7f576301d000 ---p 0001a000 08:04 288820
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/octave_comm_make.oct
> > 7f576301d000-7f576301f000 r--p 0001a000 08:04 288820
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/octave_comm_make.oct
> > 7f576301f000-7f5763020000 rw-p 0001c000 08:04 288820
> /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/octave_comm_make.oct
> > 7f5765697000-7f5765699000 r-xp 00000000 08:04 548
> /lib/libutil-2.10.1.so
> > 7f5765699000-7f5765898000 ---p 00002000 08:04 548
> /lib/libutil-2.10.1.so
> > 7f5765898000-7f5765899000 r--p 00001000 08:04 548
> /lib/libutil-2.10.1.so
> > 7f5765899000-7f576589a000 rw-p 00002000 08:04 548
> /lib/libutil-2.10.1.so
> > 7f576589a000-7f57658e7000 r-xp 00000000 08:04 140193
> /usr/lib/openmpi/lib/libopen-pal.so.0.0.0
> > 7f57658e7000-7f5765ae7000 ---p 0004d000 08:04 140193
> /usr/lib/openmpi/lib/libopen-pal.so.0.0.0
> > 7f5765ae7000-7f5765ae8000 r--p 0004d000 08:04 140193
> /usr/lib/openmpi/lib/libopen-pal.so.0.0.0
> > 7f5765ae8000-7f5765aea000 rw-p 0004e000 08:04 140193
> /usr/lib/openmpi/lib/libopen-pal.so.0.0.0
> > 7f5765aea000-7f5765b0d000 rw-p 00000000 00:00 0
> > 7f5765b0d000-7f5765b50000 r-xp 00000000 08:04 140194
> /usr/lib/openmpi/lib/libopen-rte.so.0.0.0
> > 7f5765b50000-7f5765d50000 ---p 00043000 08:04 140194
> /usr/lib/openmpi/lib/libopen-rte.so.0.0.0
> > 7f5765d50000-7f5765d51000 r--p 00043000 08:04 140194
> /usr/lib/openmpi/lib/libopen-rte.so.0.0.0
> > 7f5765d51000-7f5765d53000 rw-p 00044000 08:04 140194
> /usr/lib/openmpi/lib/libopen-rte.so.0.0.0
> > 7f5765d53000-7f5765d55000 rw-p 00000000 00:00 0
> > 7f5765d55000-7f5765de2000 r-xp 00000000 08:04 140189
> /usr/lib/openmpi/lib/libmpi.so.0.0.0
> --------------------------------------------------------------------------
> > mpirun noticed that process rank 1 with PID 2599 on node yosemite exited
> on signal 6 (Aborted).
> >
> --------------------------------------------------------------------------
> > 2 total processes killed (some possibly by mpirun during cleanup)
> > panic: Segmentation fault -- stopping myself...
> > attempting to save variables to `octave-core'...
> > panic: Aborted -- stopping myself...
> > attempting to save variables to `octave-core'...
> > panic: attempted clean up apparently failed -- aborting...
> > mich...@yosemite:~/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/inst$
>
>
>