Hi Riccardo,
I'm still seeing problems with this, both for helloworld.m and for
mc_example2.m. This is with Octave 3.2.2 and Open MPI 1.3.2 from Ubuntu
9.10, AMD64.

Below is the output I get from helloworld. Please let me know if I can do
anything to generate useful information about this.

Thanks,
Michael


mich...@yosemite:~/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/inst$ mpirun -np 3 octave -q --eval helloworld
Variables in the current scope:


  Attr Name        Size                     Bytes  Class
  ==== ====        ====                     =====  =====
       CW         -1x-1                         0  simple

Total is 1 element using 0 bytes

my_rank = 0
p =  3
We are at rank 0 that is master etc..
Variables in the current scope:

  Attr Name        Size                     Bytes  Class
  ==== ====        ====                     =====  =====
       CW         -1x-1                         0  simple

Total is 1 element using 0 bytes

my_rank =  1
p =  3
info for sending is is
info = 0
Variables in the current scope:

  Attr Name        Size                     Bytes  Class
  ==== ====        ====                     =====  =====
       CW         -1x-1                         0  simple

Total is 1 element using 0 bytes

my_rank =  2
p =  3
info for sending is is
info = 0
Greetings from process: 1!
We are at rank 0 that is master etc..
Greetings from process: 2!
*** glibc detected *** octave: double free or corruption (fasttop): 0x00000000017b18c0 ***
======= Backtrace: =========
/lib/libc.so.6[0x7f78ee49fdd6]
/lib/libc.so.6(cfree+0x6c)[0x7f78ee4a470c]
/usr/lib/libstdc++.so.6(_ZNSsD1Ev+0x39)[0x7f78eecdb0c9]
/lib/libc.so.6(exit+0xe2)[0x7f78ee462c12]
/usr/lib/octave-3.2.2/liboctinterp.so(octave_main+0xe1c)[0x7f78f4dec9ac]
/lib/libc.so.6(__libc_start_main+0xfd)[0x7f78ee448abd]
octave[0x400879]
======= Memory map: ========
00400000-00401000 r-xp 00000000 08:04 142133 /usr/bin/octave-3.2.2
00600000-00601000 r--p 00000000 08:04 142133 /usr/bin/octave-3.2.2
00601000-00602000 rw-p 00001000 08:04 142133 /usr/bin/octave-3.2.2
00edb000-017be000 rw-p 00000000 00:00 0 [heap]
410ba000-410bc000 rwxp 00000000 00:0f 1652 /dev/zero
7f78d8000000-7f78d8021000 rw-p 00000000 00:00 0
7f78d8021000-7f78dc000000 ---p 00000000 00:00 0
7f78df7d0000-7f78df7e2000 r-xp 00000000 08:04 574357 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Finalize.oct
7f78df7e2000-7f78df9e2000 ---p 00012000 08:04 574357 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Finalize.oct
7f78df9e2000-7f78df9e4000 r--p 00012000 08:04 574357 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Finalize.oct
7f78df9e4000-7f78df9e5000 rw-p 00014000 08:04 574357 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Finalize.oct
7f78df9e5000-7f78dfa18000 r-xp 00000000 08:04 574361 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Send.oct
7f78dfa18000-7f78dfc18000 ---p 00033000 08:04 574361 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Send.oct
7f78dfc18000-7f78dfc1b000 r--p 00033000 08:04 574361 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Send.oct
7f78dfc1b000-7f78dfc1c000 rw-p 00036000 08:04 574361 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Send.oct
7f78dfc1c000-7f78dfc37000 r-xp 00000000 08:04 574353 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_size.oct
7f78dfc37000-7f78dfe37000 ---p 0001b000 08:04 574353 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_size.oct
7f78dfe37000-7f78dfe39000 r--p 0001b000 08:04 574353 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_size.oct
7f78dfe39000-7f78dfe3a000 rw-p 0001d000 08:04 574353 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_size.oct
7f78e592d000-7f78e5948000 r-xp 00000000 08:04 574351 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_rank.oct
7f78e5948000-7f78e5b48000 ---p 0001b000 08:04 574351 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_rank.oct
7f78e5b48000-7f78e5b4a000 r--p 0001b000 08:04 574351 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_rank.oct
7f78e5b4a000-7f78e5b4b000 rw-p 0001d000 08:04 574351 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_rank.oct
7f78e5b4b000-7f78e5b65000 r-xp 00000000 08:04 574356 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_Load.oct
7f78e5b65000-7f78e5d65000 ---p 0001a000 08:04 574356 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_Load.oct
7f78e5d65000-7f78e5d67000 r--p 0001a000 08:04 574356 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_Load.oct
7f78e5d67000-7f78e5d68000 rw-p 0001c000 08:04 574356 /home/michael/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/src/MPI_Comm_Load.oct
7f78e83df000-7f78e83e1000 r-xp 00000000 08:04 548 /lib/libutil-2.10.1.so
7f78e83e1000-7f78e85e0000 ---p 00002000 08:04 548 /lib/libutil-2.10.1.so
7f78e85e0000-7f78e85e1000 r--p 00001000 08:04 548 /lib/libutil-2.10.1.so
7f78e85e1000-7f78e85e2000 rw-p 00002000 08:04 548 /lib/libutil-2.10.1.so
7f78e85e2000-7f78e862f000 r-xp 00000000 08:04 140193 /usr/lib/openmpi/lib/libopen-pal.so.0.0.0
7f78e862f000-7f78e882f000 ---p 0004d000 08:04 140193 /usr/lib/openmpi/lib/libopen-pal.so.0.0.0
7f78e882f000-7f78e8830000 r--p 0004d000 08:04 140193 /usr/lib/openmpi/lib/libopen-pal.so.0.0.0
7f78e8830000-7f78e8832000 rw-p 0004e000 08:04 140193 /usr/lib/openmpi/lib/libopen-pal.so.0.0.0
7f78e8832000-7f78e8855000 rw-p 00000000 00:00 0
7f78e8855000-7f78e8898000 r-xp 00000000 08:04 140194 /usr/lib/openmpi/lib/libopen-rte.so.0.0.0
7f78e8898000-7f78e8a98000 ---p 00043000 08:04 140194 /usr/lib/openmpi/lib/libopen-rte.so.0.0.0
7f78e8a98000-7f78e8a99000 r--p 00043000 08:04 140194 /usr/lib/openmpi/lib/libopen-rte.so.0.0.0
7f78e8a99000-7f78e8a9b000 rw-p 00044000 08:04 140194 /usr/lib/openmpi/lib/libopen-rte.so.0.0.0
7f78e8a9b000-7f78e8a9d000 rw-p 00000000 00:00 0
7f78e8a9d000-7f78e8b2a000 r-xp 00000000 08:04 140189 /usr/lib/openmpi/lib/libmpi.so.0.0.0
--------------------------------------------------------------------------
mpirun noticed that process rank 2 with PID 29356 on node yosemite exited on signal 6 (Aborted).
--------------------------------------------------------------------------
2 total processes killed (some possibly by mpirun during cleanup)
panic: Aborted -- stopping myself...
attempting to save variables to `octave-core'...
panic: attempted clean up apparently failed -- aborting...
panic: Aborted -- stopping myself...
attempting to save variables to `octave-core'...
panic: attempted clean up apparently failed -- aborting...
panic: Segmentation fault -- stopping myself...
attempting to save variables to `octave-core'...
mich...@yosemite:~/Desktop/of/octave/trunk/octave-forge/extra/openmpi_ext/inst$


On Wed, Dec 2, 2009 at 10:35 AM, Riccardo Corradini <riccardocorrad...@yahoo.it> wrote:

> Hello Michael,
> there is a simpler but improved version in the SVN. On my AMD64 dual-core
> machine running Ubuntu Jaunty there is no problem: no segmentation faults and
> no errors with the new version.
> I tested it with the new function MPI_Comm_Load. All the Monte Carlo examples
> work as well.
> Bests
> Riccardo
>
>
>
> --- *Tue, 1/12/09, Michael Creel <michael.cr...@uab.es>* wrote:
>
>
> From: Michael Creel <michael.cr...@uab.es>
> Subject: Re: [OctDev] A simple C++ example of a class with MPI_Comunicator property
> To: "Riccardo Corradini" <riccardocorrad...@yahoo.it>
> Date: Tuesday, 1 December 2009, 13:49
>
>
> Hi Riccardo,
> You might find this screenshot amusing. This is a PelicanHPC built using
> Debian Sid, with Octave 3.2.3 and your MPI bindings. I'm looking
> forward to trying out your fix for the comm stuff.
> Cheers, Michael
>
> On Tue, Dec 1, 2009 at 12:43 PM, Riccardo Corradini <riccardocorrad...@yahoo.it> wrote:
>
>> Dear Jaroslav,
>> after googling a lot I found a nice example that I adapted to fix the
>> MPI_Communicator problem.
>>
>> Here is the code, adjusted from
>> http://www.codeguru.com/cpp/cpp/cpp_mfc/article.php/c4031/
>>
>> There are just two files, Comm.h and Comexample.cc:
>>
>> #define FILL_COMM(RNAME,LNAME) \
>>   MPI_Comm RNAME##comm = MPI_##LNAME;
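>> // For example, FILL_COMM(world,COMM_WORLD) expands to:
>> //   MPI_Comm worldcomm = MPI_COMM_WORLD;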
>>
>> #include "mpi.h"
>> #include <cassert>
>>
>>
>>
>> #define READ_ONLY 1
>> #define WRITE_ONLY 2
>> #define READ_WRITE 3
>>
>> template <typename Container, typename ValueType, int nPropType>
>> class property
>> {
>> public:
>> property()
>> {
>>   m_cObject = NULL;
>>   Set = NULL;
>>   Get = NULL;
>> }
>> //-- Set a pointer to the class that contains the property --
>> void setContainer(Container* cObject)
>> {
>>   m_cObject = cObject;
>> }
>> //-- Set the set member function that will change the value --
>> void setter(void (Container::*pSet)(ValueType value))
>> {
>>   if((nPropType == WRITE_ONLY) || (nPropType == READ_WRITE))
>>     Set = pSet;
>>   else
>>     Set = NULL;
>> }
>> //-- Set the get member function that will retrieve the value --
>> void getter(ValueType (Container::*pGet)())
>> {
>>   if((nPropType == READ_ONLY) || (nPropType == READ_WRITE))
>>     Get = pGet;
>>   else
>>     Get = NULL;
>> }
>> //-- Overload the '=' sign to set the value using the set
>> //   member --
>> ValueType operator =(const ValueType& value)
>> {
>>   assert(m_cObject != NULL);
>>   assert(Set != NULL);
>>   (m_cObject->*Set)(value);
>>   return value;
>> }
>> //-- Make it possible to cast the property class to the
>> //   internal type --
>> operator ValueType()
>> {
>>   assert(m_cObject != NULL);
>>   assert(Get != NULL);
>>   return (m_cObject->*Get)();
>> }
>> private:
>>   Container* m_cObject;  //-- Pointer to the module that
>>                          //   contains the property --
>>   void (Container::*Set)(ValueType value);
>>                          //-- Pointer to set member function --
>>   ValueType (Container::*Get)();
>>                          //-- Pointer to get member function --
>> };
>>
>>
>> class PropTest
>> {
>> public:
>>   PropTest()
>>   {
>>     Comm.setContainer(this);
>>     Comm.setter(&PropTest::setComm);
>>     Comm.getter(&PropTest::getComm);
>>   }
>>   MPI_Comm getComm()
>>   {
>>     return m_nComm;
>>   }
>>   void setComm(MPI_Comm nComm)
>>   {
>>     m_nComm = nComm;
>>   }
>>   property<PropTest,MPI_Comm,READ_WRITE> Comm;
>>
>>
>> private:
>>   MPI_Comm m_nComm;
>> };
>>
>>
>> And the DLD function
>> #include <octave/oct.h>
>> #include "Comm.h"
>> DEFUN_DLD (CommExample, args, ,
>>            "A simple example using a C++ class holding a Communicator as a property")
>> {
>>   // Example of setting the property
>>   PropTest SetTest;
>>   SetTest.Comm = MPI_COMM_WORLD;
>>   // Example of getting the property
>>   MPI_Comm ex_comm;
>>   ex_comm = SetTest.Comm;
>>   // The problem is: how could SetTest communicate with GNU Octave?
>>   return octave_value_list ();
>> }
>>
>> I think the situation here is simpler because we have a single class
>> holding all the pieces of information, and perhaps pointers to this class
>> are easier to handle, and more portable across different MPI
>> implementations. What do you think?
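>>
>> To make the question concrete, here is a minimal, untested sketch of one
>> possible route (just an idea, not what the package does now): hand the
>> communicator to Octave as a plain integer handle, using the standard
>> MPI_Comm_c2f / MPI_Comm_f2c conversions, assuming MPI has already been
>> initialised. The name CommToHandle is only for illustration.
>>
>> #include <octave/oct.h>
>> #include "mpi.h"
>>
>> // Hypothetical oct-file: return MPI_COMM_WORLD to Octave as an integer handle.
>> DEFUN_DLD (CommToHandle, args, ,
>>            "Return MPI_COMM_WORLD as an integer handle")
>> {
>>   MPI_Comm comm = MPI_COMM_WORLD;
>>   MPI_Fint handle = MPI_Comm_c2f (comm);   // portable integer representation
>>   return octave_value (static_cast<double> (handle));
>> }
>>
>> // On the receiving side another oct-file could rebuild the communicator:
>> //   MPI_Comm comm = MPI_Comm_f2c (static_cast<MPI_Fint> (args(0).int_value ()));
>>
>> That would sidestep the pointer question entirely, though it loses the nice
>> property syntax above.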
>> Thanks a lot in advance for all your patience and help.
>> Bests
>> Riccardo
>>
>
>
------------------------------------------------------------------------------
Join us December 9, 2009 for the Red Hat Virtual Experience,
a free event focused on virtualization and cloud computing. 
Attend in-depth sessions from your desk. Your couch. Anywhere.
http://p.sf.net/sfu/redhat-sfdev2dev
_______________________________________________
Octave-dev mailing list
Octave-dev@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/octave-dev
