[ https://issues.apache.org/jira/browse/HADOOP-3344?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12594750#action_12594750 ]
craigm edited comment on HADOOP-3344 at 5/6/08 6:03 PM:
-----------------------------------------------------------------
OK, this is a first attempt at an autotools build system.
Notes/issues:
* It doesn't yet work. I haven't figured out how to get the .so shared
libraries built, so building the tests fails (a sketch of what I'm aiming for
follows this list).
* The Java-related autoconf macros came from Apache Commons Daemon. Hope this
is OK?
* I still need to test on other Linux architectures. Will need other
volunteers for alternative platforms.
* Requires autoconf 2.61, for the AC_TYPE_INT64_T and similar macros.
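For the .so problem, my understanding is that libtool produces the shared
objects automatically once the library is declared as an LTLIBRARIES target.
A minimal sketch of what I mean (the file names, source list, and
version-info value here are illustrative, not taken from the attached patch):
{noformat}
# configure.ac (sketch): enable libtool so shared libraries can be built
AC_INIT([libhdfs], [0.1])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
AC_PROG_LIBTOOL
AC_TYPE_INT64_T
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am (sketch): an LTLIBRARIES target makes libtool build the .so
lib_LTLIBRARIES = libhdfs.la
libhdfs_la_SOURCES = hdfs.c
libhdfs_la_LDFLAGS = -version-info 1:0:0
{noformat}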
To generate configure and the Makefile, run:
{noformat}
autoreconf; libtoolize && aclocal -I ../utils/m4/ && automake -a --foreign && autoconf
{noformat}
For normal building:
{noformat}
./configure && make
{noformat}
Can anyone provide assistance in completing this?
> libhdfs: always builds 32bit, even when x86_64 Java used
> --------------------------------------------------------
>
> Key: HADOOP-3344
> URL: https://issues.apache.org/jira/browse/HADOOP-3344
> Project: Hadoop Core
> Issue Type: Bug
> Components: build, libhdfs
> Environment: x86_64 linux, x86_64 Java installed
> Reporter: Craig Macdonald
> Attachments: HADOOP-3344.v0.patch
>
>
> The makefile for libhdfs is hard-coded to compile 32-bit libraries. It should
> perhaps compile depending on which Java is in use.
> The relevant lines are:
> LDFLAGS = -L$(JAVA_HOME)/jre/lib/$(OS_ARCH)/server -ljvm -shared -m32 -Wl,-x
> CPPFLAGS = -m32 -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/$(PLATFORM)
> $OS_ARCH can be (e.g.) amd64 if you're using a 64-bit Java on the x86_64
> platform. So while gcc will try to link against the correct libjvm.so, the
> link fails because libhdfs is being built 32-bit (because of -m32):
> {noformat}
> [exec] /usr/bin/ld: skipping incompatible /usr/java64/latest/jre/lib/amd64/server/libjvm.so when searching for -ljvm
> [exec] /usr/bin/ld: cannot find -ljvm
> [exec] collect2: ld returned 1 exit status
> [exec] make: *** [/root/def/hadoop-0.16.3/build/libhdfs/libhdfs.so.1] Error 1
> {noformat}
> The solution should be to specify -m32 or -m64 depending on the os.arch
> detected. There are 3 cases to handle:
> * 32-bit OS, 32-bit Java => libhdfs should be built 32-bit, specify -m32
> * 64-bit OS, 32-bit Java => libhdfs should be built 32-bit, specify -m32
> * 64-bit OS, 64-bit Java => libhdfs should be built 64-bit, specify -m64
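> One possible sketch of such a Makefile change (GNU make syntax; treating
> amd64 as the only 64-bit os.arch value is an assumption here, and other
> values such as ia64 or sparcv9 would need the same handling):
> {noformat}
> # sketch: pick -m32 or -m64 from the os.arch the build detected
> ifeq ($(OS_ARCH),amd64)
>   JVM_ARCH = -m64
> else
>   JVM_ARCH = -m32
> endif
> LDFLAGS = -L$(JAVA_HOME)/jre/lib/$(OS_ARCH)/server -ljvm -shared $(JVM_ARCH) -Wl,-x
> CPPFLAGS = $(JVM_ARCH) -I$(JAVA_HOME)/include -I$(JAVA_HOME)/include/$(PLATFORM)
> {noformat}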