Hey Trevor,

Thanks for contributing. Supporting ARM on Hadoop will require a
number of different changes, right? E.g., given that Hadoop currently
depends on some Sun-specific classes and requires a Sun-compatible
JVM, you'll have to work around that dependency somehow; there's no
Sun JVM for ARM, right?

If there's a handful of additional changes then let's make an umbrella
jira for Hadoop ARM support and make the issues you've already filed
sub-tasks. You can ping me off-line about how to do that if you want.
Supporting non-x86 processors and non-gcc compilers is an additional
maintenance burden on the project so it would be helpful to have an
end-game figured out so these patches don't bitrot in the meantime.

Thanks,
Eli

On Tue, May 10, 2011 at 5:13 PM, Trevor Robinson <tre...@scurrilous.com> wrote:
> Is the native build failing on ARM (where gcc doesn't support -m32) a
> known issue, and is there a workaround or fix pending?
>
> $ ant -Dcompile.native=true
> ...
>      [exec] make  all-am
>      [exec] make[1]: Entering directory
> `/home/trobinson/dev/hadoop-common/build/native/Linux-arm-32'
>      [exec] /bin/bash ./libtool  --tag=CC   --mode=compile gcc
> -DHAVE_CONFIG_H -I. -I/home/trobinson/dev/hadoop-common/src/native
> -I/usr/lib/jvm/java-6-openjdk/include
> -I/usr/lib/jvm/java-6-openjdk/include/linux
> -I/home/trobinson/dev/hadoop-common/src/native/src
> -Isrc/org/apache/hadoop/io/compress/zlib
> -Isrc/org/apache/hadoop/security -Isrc/org/apache/hadoop/io/nativeio/
> -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF
> .deps/ZlibCompressor.Tpo -c -o ZlibCompressor.lo `test -f
> 'src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c' || echo
> '/home/trobinson/dev/hadoop-common/src/native/'`src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c
>      [exec] libtool: compile:  gcc -DHAVE_CONFIG_H -I.
> -I/home/trobinson/dev/hadoop-common/src/native
> -I/usr/lib/jvm/java-6-openjdk/include
> -I/usr/lib/jvm/java-6-openjdk/include/linux
> -I/home/trobinson/dev/hadoop-common/src/native/src
> -Isrc/org/apache/hadoop/io/compress/zlib
> -Isrc/org/apache/hadoop/security -Isrc/org/apache/hadoop/io/nativeio/
> -g -Wall -fPIC -O2 -m32 -g -O2 -MT ZlibCompressor.lo -MD -MP -MF
> .deps/ZlibCompressor.Tpo -c
> /home/trobinson/dev/hadoop-common/src/native/src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c
>  -fPIC -DPIC -o .libs/ZlibCompressor.o
>      [exec] make[1]: Leaving directory
> `/home/trobinson/dev/hadoop-common/build/native/Linux-arm-32'
>      [exec] cc1: error: unrecognized command line option "-m32"
>      [exec] make[1]: *** [ZlibCompressor.lo] Error 1
>      [exec] make: *** [all] Error 2
>
> The closest issue I can find is
> https://issues.apache.org/jira/browse/HADOOP-6258 (Native compilation
> assumes gcc), as well as other issues regarding where and how to
> specify -m32/64. However, there doesn't seem to be a specific issue
> covering build failure on systems using gcc where the gcc target does
> not support -m32/64 (such as ARM).
>
> I've attached a patch that disables specifying -m$(JVM_DATA_MODEL)
> when $host_cpu starts with "arm". (For instance, host_cpu = armv7l for
> my system.) To any maintainers on this list, please let me know if
> you'd like me to open a new issue and/or attach this patch to an
> issue.
>
> Thanks,
> Trevor
>
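For readers following along, the host_cpu check Trevor describes can be
sketched in shell (mirroring the case-pattern matching autoconf uses in
configure scripts; the variable names here are illustrative, not the
actual patch):

```shell
# Hedged sketch of the approach described above: skip -m$(JVM_DATA_MODEL)
# when the gcc target is ARM. host_cpu and JVM_DATA_MODEL mirror the
# autoconf build's variables but are hard-coded here for illustration.
host_cpu=armv7l                # e.g. what config.guess reports on Trevor's system
JVM_DATA_MODEL=32              # normally detected from the JVM

case "$host_cpu" in
  arm*)
    # ARM gcc rejects -m32/-m64 ("cc1: error: unrecognized command line option")
    ARCH_FLAG=""
    ;;
  *)
    ARCH_FLAG="-m${JVM_DATA_MODEL}"
    ;;
esac

echo "CFLAGS arch flag: '${ARCH_FLAG}'"
```

On an armv7l host this leaves the arch flag empty, so the libtool/gcc
invocation shown in the log above would no longer pass -m32.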
