Thanks for reporting this, Frank -- looks like we borked a symbol in the xgrid component in 1.3. It seems that the compiler doesn't complain about the missing symbol; it only shows up when you try to *run* with it. Whoops!
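For what it's worth, the dangling reference is visible in the component itself before anything runs; something along these lines (using the path from your error message) should list it among the plugin's undefined symbols:

 % nm -u /usr/local/mpi/lib/openmpi/mca_plm_xgrid.so | grep pointer_array
 _orte_pointer_array_add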

I filed ticket https://svn.open-mpi.org/trac/ompi/ticket/1777 about this issue.
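Until that fix is released, one possible (untested) workaround is to exclude the xgrid launcher so the broken component never gets selected, e.g.:

 % mpirun --mca plm ^xgrid -np 2 ./my_app

(./my_app is just a stand-in for your executable.) The obvious downside is that this falls back to another launcher such as rsh instead of launching through Xgrid, so it only helps if you can live without Xgrid until the fix is out.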


On Jan 23, 2009, at 3:11 PM, Frank Kahle wrote:

I'm running OpenMPI on OS X 10.4.11. After upgrading to OpenMPI-1.3 I get the following error when submitting a job via XGrid:

dyld: lazy symbol binding failed: Symbol not found: _orte_pointer_array_add
 Referenced from: /usr/local/mpi/lib/openmpi/mca_plm_xgrid.so
 Expected in: flat namespace

Here is the output of ompi_info:
[g5-node-1:~] motte% ompi_info
                Package: Open MPI root@ibi.local Distribution
               Open MPI: 1.3
  Open MPI SVN revision: r20295
  Open MPI release date: Jan 19, 2009
               Open RTE: 1.3
  Open RTE SVN revision: r20295
  Open RTE release date: Jan 19, 2009
                   OPAL: 1.3
      OPAL SVN revision: r20295
      OPAL release date: Jan 19, 2009
           Ident string: 1.3
                 Prefix: /usr/local/mpi
Configured architecture: powerpc-apple-darwin8
         Configure host: ibi.local
          Configured by: root
          Configured on: Tue Jan 20 19:45:26 CET 2009
         Configure host: ibi.local
               Built by: root
               Built on: Tue Jan 20 20:49:48 CET 2009
             Built host: ibi.local
             C bindings: yes
           C++ bindings: yes
     Fortran77 bindings: yes (single underscore)
     Fortran90 bindings: yes
Fortran90 bindings size: small
             C compiler: gcc-4.3
    C compiler absolute: /usr/local/bin/gcc-4.3
           C++ compiler: c++-4.3
  C++ compiler absolute: /usr/local/bin/c++-4.3
     Fortran77 compiler: gfortran-4.3
 Fortran77 compiler abs: /usr/local/bin/gfortran-4.3
     Fortran90 compiler: gfortran-4.3
 Fortran90 compiler abs: /usr/local/bin/gfortran-4.3
            C profiling: yes
          C++ profiling: yes
    Fortran77 profiling: yes
    Fortran90 profiling: yes
         C++ exceptions: no
         Thread support: posix (mpi: no, progress: no)
          Sparse Groups: no
 Internal debug support: no
    MPI parameter check: runtime
Memory profiling support: no
Memory debugging support: no
        libltdl support: yes
  Heterogeneous support: no
mpirun default --prefix: no
        MPI I/O support: yes
      MPI_WTIME support: gettimeofday
Symbol visibility support: yes
  FT Checkpoint support: no  (checkpoint thread: no)
          MCA backtrace: darwin (MCA v2.0, API v2.0, Component v1.3)
          MCA paffinity: darwin (MCA v2.0, API v2.0, Component v1.3)
               MCA carto: auto_detect (MCA v2.0, API v2.0, Component v1.3)
              MCA carto: file (MCA v2.0, API v2.0, Component v1.3)
           MCA maffinity: first_use (MCA v2.0, API v2.0, Component v1.3)
              MCA timer: darwin (MCA v2.0, API v2.0, Component v1.3)
        MCA installdirs: env (MCA v2.0, API v2.0, Component v1.3)
        MCA installdirs: config (MCA v2.0, API v2.0, Component v1.3)
                MCA dpm: orte (MCA v2.0, API v2.0, Component v1.3)
             MCA pubsub: orte (MCA v2.0, API v2.0, Component v1.3)
          MCA allocator: basic (MCA v2.0, API v2.0, Component v1.3)
          MCA allocator: bucket (MCA v2.0, API v2.0, Component v1.3)
               MCA coll: basic (MCA v2.0, API v2.0, Component v1.3)
               MCA coll: hierarch (MCA v2.0, API v2.0, Component v1.3)
               MCA coll: inter (MCA v2.0, API v2.0, Component v1.3)
               MCA coll: self (MCA v2.0, API v2.0, Component v1.3)
               MCA coll: sm (MCA v2.0, API v2.0, Component v1.3)
               MCA coll: tuned (MCA v2.0, API v2.0, Component v1.3)
                 MCA io: romio (MCA v2.0, API v2.0, Component v1.3)
              MCA mpool: fake (MCA v2.0, API v2.0, Component v1.3)
              MCA mpool: rdma (MCA v2.0, API v2.0, Component v1.3)
              MCA mpool: sm (MCA v2.0, API v2.0, Component v1.3)
                MCA pml: cm (MCA v2.0, API v2.0, Component v1.3)
                MCA pml: ob1 (MCA v2.0, API v2.0, Component v1.3)
                MCA pml: v (MCA v2.0, API v2.0, Component v1.3)
                MCA bml: r2 (MCA v2.0, API v2.0, Component v1.3)
             MCA rcache: vma (MCA v2.0, API v2.0, Component v1.3)
                MCA btl: self (MCA v2.0, API v2.0, Component v1.3)
                MCA btl: sm (MCA v2.0, API v2.0, Component v1.3)
                MCA btl: tcp (MCA v2.0, API v2.0, Component v1.3)
               MCA topo: unity (MCA v2.0, API v2.0, Component v1.3)
                MCA osc: pt2pt (MCA v2.0, API v2.0, Component v1.3)
                MCA osc: rdma (MCA v2.0, API v2.0, Component v1.3)
                MCA iof: hnp (MCA v2.0, API v2.0, Component v1.3)
                MCA iof: orted (MCA v2.0, API v2.0, Component v1.3)
                MCA iof: tool (MCA v2.0, API v2.0, Component v1.3)
                MCA oob: tcp (MCA v2.0, API v2.0, Component v1.3)
               MCA odls: default (MCA v2.0, API v2.0, Component v1.3)
                MCA ras: slurm (MCA v2.0, API v2.0, Component v1.3)
               MCA rmaps: rank_file (MCA v2.0, API v2.0, Component v1.3)
               MCA rmaps: round_robin (MCA v2.0, API v2.0, Component v1.3)
              MCA rmaps: seq (MCA v2.0, API v2.0, Component v1.3)
                MCA rml: oob (MCA v2.0, API v2.0, Component v1.3)
             MCA routed: binomial (MCA v2.0, API v2.0, Component v1.3)
             MCA routed: direct (MCA v2.0, API v2.0, Component v1.3)
             MCA routed: linear (MCA v2.0, API v2.0, Component v1.3)
                MCA plm: rsh (MCA v2.0, API v2.0, Component v1.3)
                MCA plm: slurm (MCA v2.0, API v2.0, Component v1.3)
                MCA plm: xgrid (MCA v2.0, API v2.0, Component v1.3)
              MCA filem: rsh (MCA v2.0, API v2.0, Component v1.3)
             MCA errmgr: default (MCA v2.0, API v2.0, Component v1.3)
                MCA ess: env (MCA v2.0, API v2.0, Component v1.3)
                MCA ess: hnp (MCA v2.0, API v2.0, Component v1.3)
                 MCA ess: singleton (MCA v2.0, API v2.0, Component v1.3)
                MCA ess: slurm (MCA v2.0, API v2.0, Component v1.3)
                MCA ess: tool (MCA v2.0, API v2.0, Component v1.3)
            MCA grpcomm: bad (MCA v2.0, API v2.0, Component v1.3)
            MCA grpcomm: basic (MCA v2.0, API v2.0, Component v1.3)

Do I have to configure OpenMPI-1.3 in a different way than OpenMPI-1.2.8?

Kind regards,
Frank


--
Jeff Squyres
Cisco Systems
