Hi,

Today I tried to build openmpi-v1.10-dev-59-g4e9cea6 on my machines
(Solaris 10 Sparc, Solaris 10 x86_64, and openSUSE Linux 12.1 x86_64)
with gcc-4.9.2 and Sun C 5.13, and I got the following error on all
platforms with both compilers. I already reported this problem a few
days ago; Nathan wanted to fix it, and I'm not sure whether he hasn't
had time to do it yet or whether this is a different problem now
(https://github.com/open-mpi/ompi-release/pull/299). I used the
following command to configure the package.

../openmpi-v1.10-dev-59-g4e9cea6/configure \
  --prefix=/usr/local/openmpi-1.10.0_64_gcc \
  --libdir=/usr/local/openmpi-1.10.0_64_gcc/lib64 \
  --with-jdk-bindir=/usr/local/jdk1.8.0/bin \
  --with-jdk-headers=/usr/local/jdk1.8.0/include \
  JAVA_HOME=/usr/local/jdk1.8.0 \
  LDFLAGS="-m64" CC="gcc" CXX="g++" FC="gfortran" \
  CFLAGS="-m64" CXXFLAGS="-m64" FCFLAGS="-m64" \
  CPP="cpp" CXXCPP="cpp" \
  CPPFLAGS="" CXXCPPFLAGS="" \
  --enable-mpi-cxx \
  --enable-cxx-exceptions \
  --enable-mpi-java \
  --enable-heterogeneous \
  --enable-mpi-thread-multiple \
  --with-hwloc=internal \
  --without-verbs \
  --with-wrapper-cflags="-std=c11 -m64" \
  --with-wrapper-cxxflags="-m64" \
  --with-wrapper-fcflags="-m64" \
  --enable-debug \
  |& tee log.configure.$SYSTEM_ENV.$MACHINE_ENV.64_gcc


With gcc on Solaris 10 Sparc:

...
  CC       osc_pt2pt_data_move.lo
../../../../../openmpi-v1.10-dev-59-g4e9cea6/ompi/mca/osc/pt2pt/osc_pt2pt_data_move.c: In function 'osc_pt2pt_accumulate_buffer':
../../../../../openmpi-v1.10-dev-59-g4e9cea6/ompi/mca/osc/pt2pt/osc_pt2pt_data_move.c:571:20: error: 'opal_list_item_t' has no member named 'proc_arch'
     if (proc->super.proc_arch != ompi_proc_local()->super.proc_arch) {
                    ^
../../../../../openmpi-v1.10-dev-59-g4e9cea6/ompi/mca/osc/pt2pt/osc_pt2pt_data_move.c:571:58: error: 'opal_list_item_t' has no member named 'proc_arch'
     if (proc->super.proc_arch != ompi_proc_local()->super.proc_arch) {
                                                          ^
make[2]: *** [osc_pt2pt_data_move.lo] Error 1
make[2]: Leaving directory `/export2/src/openmpi-1.10.0/openmpi-v1.10-dev-59-g4e9cea6-SunOS.sparc.64_gcc/ompi/mca/osc/pt2pt'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/export2/src/openmpi-1.10.0/openmpi-v1.10-dev-59-g4e9cea6-SunOS.sparc.64_gcc/ompi'
make: *** [all-recursive] Error 1
tyr openmpi-v1.10-dev-59-g4e9cea6-SunOS.sparc.64_gcc 122 




With Sun C (cc) on Linux x86_64:

...
  CC       osc_pt2pt_data_move.lo
"../../../../../openmpi-v1.10-dev-59-g4e9cea6/opal/include/opal/sys/amd64/atomic.h",
 line 136: warning: parameter in inline asm statement unused: %3
"../../../../../openmpi-v1.10-dev-59-g4e9cea6/opal/include/opal/sys/amd64/atomic.h",
 line 182: warning: parameter in inline asm statement unused: %2
"../../../../../openmpi-v1.10-dev-59-g4e9cea6/opal/include/opal/sys/amd64/atomic.h",
 line 203: warning: parameter in inline asm statement unused: %2
"../../../../../openmpi-v1.10-dev-59-g4e9cea6/opal/include/opal/sys/amd64/atomic.h",
 line 224: warning: parameter in inline asm statement unused: %2
"../../../../../openmpi-v1.10-dev-59-g4e9cea6/opal/include/opal/sys/amd64/atomic.h",
 line 245: warning: parameter in inline asm statement unused: %2
"../../../../../openmpi-v1.10-dev-59-g4e9cea6/ompi/mca/osc/pt2pt/osc_pt2pt_data_move.c",
 line 571: improper member use: proc_arch
"../../../../../openmpi-v1.10-dev-59-g4e9cea6/ompi/mca/osc/pt2pt/osc_pt2pt_data_move.c",
 line 571: improper member use: proc_arch
cc: acomp failed for 
../../../../../openmpi-v1.10-dev-59-g4e9cea6/ompi/mca/osc/pt2pt/osc_pt2pt_data_move.c
make[2]: *** [osc_pt2pt_data_move.lo] Error 1
make[2]: Leaving directory `/export2/src/openmpi-1.10.0/openmpi-v1.10-dev-59-g4e9cea6-Linux.x86_64.64_cc/ompi/mca/osc/pt2pt'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/export2/src/openmpi-1.10.0/openmpi-v1.10-dev-59-g4e9cea6-Linux.x86_64.64_cc/ompi'
make: *** [all-recursive] Error 1
linpc1 openmpi-v1.10-dev-59-g4e9cea6-Linux.x86_64.64_cc 197 
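
Maybe this helps to narrow the problem down: both compilers complain that
proc_arch is reached through proc->super, and on this branch super still
seems to be an opal_list_item_t, which has no such member. Below is only a
minimal, self-contained sketch of that mismatch; the struct layouts and the
helper are simplified assumptions for illustration, not the real Open MPI
1.10 headers or code.

/* Sketch only -- assumed layouts, NOT the actual Open MPI 1.10 headers. */

typedef struct {
    void *opal_list_next;
    void *opal_list_prev;
} opal_list_item_t;               /* assumed: no proc_arch member here */

typedef struct {
    opal_list_item_t super;       /* assumed: still a plain list item on v1.10 */
    unsigned int     proc_arch;   /* assumed: architecture field on the proc itself */
} ompi_proc_t;

static ompi_proc_t local_proc;

static ompi_proc_t *ompi_proc_local(void)
{
    return &local_proc;           /* stand-in for the real ompi_proc_local() */
}

int arch_differs(const ompi_proc_t *proc)
{
    /* The shipped line 571 cannot compile against this layout, because
     * opal_list_item_t has no proc_arch:
     *
     *   if (proc->super.proc_arch != ompi_proc_local()->super.proc_arch) { ... }
     *
     * Going through ompi_proc_t directly compiles with the assumed layout;
     * whether that is the right fix for the v1.10 branch is something the
     * developers would have to confirm.
     */
    return proc->proc_arch != ompi_proc_local()->proc_arch;
}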


I would be grateful if somebody could fix the problem. Thank you
very much in advance for any help.


Kind regards

Siegmar
