Bug#625832: python-mpi4py: importing MPI fails

2012-05-03 Thread Juha Jäykkä
 I cannot reproduce this with the version in stable, nor with the
 unstable version built on stable or in an unstable chroot.  I am
 downgrading this bug and marking it unreproducible for now.
 
 How did you compile the above program, what architecture was it on, and
 was it on an up-to-date unstable system?  Can you try again with an
 updated unstable system?

It seems to work on an up-to-date unstable system now. And I am fairly sure it
has been working for quite a while: I no longer recall the details of the
issue, so it cannot have plagued me for long.

As it is almost a year since I reported the bug, I can no longer recall the
exact specs of the system: I can guess from the report which computer it was,
but not the package versions beyond those listed in the report, some of which
are missing because I originally filed the bug against the wrong package.

But, as I see no changes to any of the openmpi packages since April 2011, it 
is hard to see why anything would have changed. 

From my point of view, just mark this solved at the next upload.

-Juha




Bug#625832: python-mpi4py: importing MPI fails

2012-05-02 Thread Michael Banck
severity 625832 important
tags 625832 +unreproducible
thanks

Hi,

On Sun, May 08, 2011 at 12:22:35PM +0100, Juha Jäykkä wrote:
 That result is independent of number of ranks and whether I start the program 
 with or without orterun. The hello.c is very short:
 
 #include <mpi.h>
 #include <stdio.h>
 int main(int argc, char **argv) {
   int rc, id;
   rc=MPI_Init(&argc, &argv);
   rc=MPI_Comm_rank(MPI_COMM_WORLD, &id);
   printf("My id = %i\n", id);
   MPI_Finalize();
 }

I cannot reproduce this with the version in stable, nor with the
unstable version built on stable or in an unstable chroot.  I am
downgrading this bug and marking it unreproducible for now.

How did you compile the above program, what architecture was it on, and
was it on an up-to-date unstable system?  Can you try again with an
updated unstable system?


Cheers,

Michael



-- 
To UNSUBSCRIBE, email to debian-bugs-dist-requ...@lists.debian.org
with a subject of unsubscribe. Trouble? Contact listmas...@lists.debian.org



Bug#625832: python-mpi4py: importing MPI fails

2011-05-08 Thread Yaroslav Halchenko
reassign 625832 libopenmpi1.3
thanks

Since it hangs in non-Python programs as well, it has to do with the generic
install/configuration of openmpi, thus reassigning to openmpi.


On Sun, 08 May 2011, Juha Jäykkä wrote:

 Let's see...

  python -c 'import mpi4py.MPI'

 No use, hangs similarly.

  2. where does it hang? (according to strace or maybe gdb)?

 Strace and gdb traces attached. Here are the relevant parts of the ps and
 netstat output, too.

 orted --hnp --set-sid --report-uri 7 --singleton-died-pipe 8

 tcp    0   0 *:60561      *:*       LISTEN   6238/orted
 tcp6   0   0 [::]:34267   [::]:*    LISTEN   6238/orted

  3. does any other (non-Python) application built against openmpi and run
 with orterun hang? (i.e. is it configured properly, etc.)

 Hangs in exactly the same place:

 #0  0x7690e668 in __poll (fds=0x629ac0, nfds=4, timeout=<value optimized out>) at ../sysdeps/unix/sysv/linux/poll.c:83
 #1  0x776a3921 in ?? () from /usr/lib/libopen-pal.so.0
 #2  0x776a28cf in ?? () from /usr/lib/libopen-pal.so.0
 #3  0x776970b1 in opal_progress () from /usr/lib/libopen-pal.so.0
 #4  0x75a2c7e5 in ?? () from /usr/lib/openmpi/lib/openmpi/mca_rml_oob.so
 #5  0x75a2ccb0 in ?? () from /usr/lib/openmpi/lib/openmpi/mca_rml_oob.so
 #6  0x77915986 in orte_routed_base_register_sync () from /usr/lib/libopen-rte.so.0
 #7  0x760384ce in ?? () from /usr/lib/openmpi/lib/openmpi/mca_routed_binomial.so
 #8  0x77902652 in orte_ess_base_app_setup () from /usr/lib/libopen-rte.so.0
 #9  0x75c30fa1 in ?? () from /usr/lib/openmpi/lib/openmpi/mca_ess_singleton.so
 #10 0x778e9ba3 in orte_init () from /usr/lib/libopen-rte.so.0
 #11 0x77b62397 in ?? () from /usr/lib/libmpi.so.0
 #12 0x77b833e0 in PMPI_Init () from /usr/lib/libmpi.so.0
 #13 0x00400916 in main (argc=1, argv=0x7fffddc8) at hello.c:5

 That result is independent of number of ranks and whether I start the program 
 with or without orterun. The hello.c is very short:

 #include <mpi.h>
 #include <stdio.h>
 int main(int argc, char **argv) {
   int rc, id;
   rc=MPI_Init(&argc, &argv);
   rc=MPI_Comm_rank(MPI_COMM_WORLD, &id);
   printf("My id = %i\n", id);
   MPI_Finalize();
 }

 And, as you see from the backtrace, it hangs in MPI_Init().

 Does this mean that the problem is not actually mpi4py, but openmpi (or some 
 such) instead?
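[Editor's note: on the Python side of a hang like this, the stdlib faulthandler module can produce a stack dump comparable to the gdb trace above without attaching a debugger. A minimal sketch, assuming a hang you can reproduce at will; the time.sleep() below is only a stand-in for the call that actually blocks (here, MPI_Init), and the file name is arbitrary:]

```python
import faulthandler
import time

# Arm a watchdog: if the process is still running after 1 second,
# dump every thread's Python stack to the given file.
with open("stacks.txt", "w") as log:
    faulthandler.dump_traceback_later(1, file=log)
    time.sleep(2)  # stand-in for the hanging call (e.g. the MPI_Init inside `import mpi4py.MPI`)
    faulthandler.cancel_dump_traceback_later()

print("dump captured" if open("stacks.txt").read() else "no dump")
```

[The dump only shows Python frames, so for a hang inside libmpi itself gdb is still needed, but it quickly confirms which Python statement never returns.]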

  altogether it might be simply a generic misconfiguration of mpi.  If we

 I do not believe this. I never configured openmpi AT ALL; every single
 openmpi setting is pristine, exactly as it came out of the Debian package.
 I never had to touch anything, which is precisely why I started using
 openmpi all those years ago, when mpich needed tweaking just to get it
 running at all.

 Cheers,
 Juha
-- 
=--=
Keep in touch www.onerussian.com
Yaroslav Halchenko www.ohloh.net/accounts/yarikoptic






Bug#625832: python-mpi4py: importing MPI fails

2011-05-06 Thread Juha Jäykkä
Package: python-mpi4py
Version: 1.2.2-2
Severity: grave
Justification: renders package unusable


I am not sure I am blaming the correct package here, but

orterun -n 1 python -c 'import mpi4py.MPI'

hangs and never returns, regardless of how I start orted.

This used to work about a month ago (my previous upgrade).

It may be that the problem is in python or openmpi as well, since trying
a previously compiled, private version of mpi4py does exactly the same
thing. But I leave that to someone smarter to decide.

Cheers,
Juha

-- System Information:
Debian Release: wheezy/sid
  APT prefers unstable
  APT policy: (500, 'unstable'), (500, 'testing'), (1, 'experimental')
Architecture: amd64 (x86_64)

Kernel: Linux 2.6.38-2-amd64 (SMP w/2 CPU cores)
Locale: LANG=C, LC_CTYPE=C (charmap=UTF-8) (ignored: LC_ALL set to en_GB.UTF-8)
Shell: /bin/sh linked to /bin/bash

Versions of packages python-mpi4py depends on:
ii  libc6 2.13-2 Embedded GNU C Library: Shared lib
ii  libopenmpi1.3 1.4.3-2.1  high performance message passing l
ii  python2.6.6-14   interactive high-level object-orie
ii  python-support1.0.13 automated rebuilding support for P

Versions of packages python-mpi4py recommends:
ii  mpi-default-bin   0.6Standard MPI runtime programs

Versions of packages python-mpi4py suggests:
ii  python-numpy  1:1.4.1-5  Numerical Python adds a fast array

-- no debconf information






Bug#625832: python-mpi4py: importing MPI fails

2011-05-06 Thread Yaroslav Halchenko
possibly useless, but

1. does it hang if you just do

python -c 'import mpi4py.MPI'

?

2. where does it hang? (according to strace or maybe gdb)?

3. does any other (non-Python) application built against openmpi and run
   with orterun hang? (i.e. is it configured properly, etc.)


altogether it might simply be a generic misconfiguration of mpi.  If we
do not figure it out within a day or two, I would lower the severity,
since it works for me.

On Fri, 06 May 2011, Juha Jäykkä wrote:

 Package: python-mpi4py
 Version: 1.2.2-2
 Severity: grave
 Justification: renders package unusable


 I am not sure I am blaming the correct package here, but

 orterun -n 1 python -c 'import mpi4py.MPI'

 hangs and never returns. Regardless of how I start orted.

-- 
=--=
Keep in touch www.onerussian.com
Yaroslav Halchenko www.ohloh.net/accounts/yarikoptic


