[ 
https://issues.apache.org/jira/browse/HADOOP-7979?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13192005#comment-13192005
 ] 

Michael Noll commented on HADOOP-7979:
--------------------------------------

@Allen:

> Won't explicitly setting LDFLAGS and CXXFLAGS break gcc's that don't
> know what these flags are?

I am not overly familiar with gcc and ld.  While patching a similar issue for 
Hadoop-LZO we learned that the gcc on Mac OS X Lion doesn't support the 
{{-Wl,\--no-as-needed}} option in LDFLAGS, so the patch includes a Maven 
profile that skips setting LDFLAGS on Macs.  As you suggested, there might be 
other OSes (Solaris?) or gcc versions that behave differently, but I don't have 
any direct experience there to provide further information.
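
For illustration, a sketch of that profile approach (the profile id and property name below are hypothetical; the attached patch has the authoritative wiring):

{code}
<!-- Sketch only: set the extra linker flag everywhere except Mac OS X,
     using Maven's OS-based profile activation with the "!" negation. -->
<profiles>
  <profile>
    <id>native-extra-ldflags</id>
    <activation>
      <os><family>!mac</family></os>
    </activation>
    <properties>
      <extra.ldflags>-Wl,--no-as-needed</extra.ldflags>
    </properties>
  </profile>
</profiles>
{code}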

> Can users override?

What would be the preferred way to do this in the current (trunk) Maven setup?  
I have seen that native code compilation must explicitly be enabled by running 
the build with the proper Maven profile, e.g. {{mvn -Pnative ...}}.

At the moment the patch works like this: the LDFLAGS option is enabled by 
default and only disabled under specific conditions (currently: when running 
on a Mac box).  Would you prefer it the other way around (disabled by default)?
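
If the flag were exposed as a Maven property (hypothetical name {{extra.ldflags}}, not part of the current patch), a user override from the command line could look like:

{code}
$ mvn -Pnative -Dextra.ldflags="" compile                      # disable the flag
$ mvn -Pnative -Dextra.ldflags="-Wl,--no-as-needed" compile    # set it explicitly
{code}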

@Arun:

> Can you please check if we need this for native code in other
> parts of Hadoop? E.g. linux-container-executor in MR.

I am still working on this (and it takes some time because re-running the build 
including tests takes several hours).

I have only found a few occurrences of native code references in pom.xml files 
in trunk, with linux-container-executor in MR being one of them:

{code}
$ find . -name "pom.xml" | xargs grep native | sed -r 's/^([^:]*):.*/\1/' | sort -u
./hadoop-common-project/hadoop-common/pom.xml
./hadoop-hdfs-project/hadoop-hdfs/pom.xml
./hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/pom.xml
./hadoop-project-dist/pom.xml
./hadoop-project/pom.xml
{code}


However, at least for linux-container-executor (i.e. 
{{./hadoop-mapreduce-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager/pom.xml}}),
 I could run the build via {{mvn -Pnative compile}} successfully as-is -- 
even on the "problematic" Ubuntu 11.10.  Should these components still be 
patched even though they currently compile without problems?
                
> Native code: configure LDFLAGS and CXXFLAGS to fix the build on systems like 
> Ubuntu 11.10
> -----------------------------------------------------------------------------------------
>
>                 Key: HADOOP-7979
>                 URL: https://issues.apache.org/jira/browse/HADOOP-7979
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 0.24.0
>         Environment: Ubuntu 11.10+
>            Reporter: Michael Noll
>            Assignee: Michael Noll
>             Fix For: 0.24.0
>
>         Attachments: HADOOP-7979.trunk.v1.txt
>
>
> I noticed that the build of Hadoop trunk (0.24) and the 1.0/0.20.20x branches 
> fail on Ubuntu 11.10 when trying to include the native code in the build. The 
> reason is that the default behavior of {{ld}} was changed in Ubuntu 11.10.
> *Background*
> From [Ubuntu 11.10 Release 
> Notes|https://wiki.ubuntu.com/OneiricOcelot/ReleaseNotes#GCC_4.6_Toolchain]:
> {code}
>     The compiler passes by default two additional flags to the linker:
>     [...snipp...]
>     -Wl,--as-needed with this option the linker will only add a DT_NEEDED tag
>     for a dynamic library mentioned on the command line if the library is
>     actually used.
> {code}
> This was apparently planned to be changed already back in 11.04 but was 
> eventually reverted in the final release. From [11.04 Toolchain 
> Transition|https://wiki.ubuntu.com/NattyNarwhal/ToolchainTransition#Indirect_Linking_for_Shared_Libraries]:
> {quote}
> Also in Natty, ld runs with the {{\--as-needed}} option enabled by default.  
> This means that, in the example above, if no symbols from {{libwheel}} were 
> needed by racetrack, then {{libwheel}} would not be linked even if it was 
> explicitly included in the command-line compiler flags. NOTE: The ld 
> {{\--as-needed}} default was reverted for the final natty release, and will 
> be re-enabled in the o-series.
> {quote}
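> The effect can be seen in isolation with a small experiment (sketch; needs only gcc and binutils, the file names are made up):
> {code}
> # Build a shared library whose symbols the program never uses,
> # then link it with and without --as-needed and inspect DT_NEEDED.
> set -e
> dir=$(mktemp -d); cd "$dir"
> echo 'int wheel(void) { return 42; }' > libwheel.c
> gcc -shared -fPIC -o libwheel.so libwheel.c
> echo 'int main(void) { return 0; }' > main.c
> # Without --as-needed: libwheel.so gets a DT_NEEDED entry although unused.
> gcc -o main_default main.c -L. -lwheel -Wl,--no-as-needed
> readelf -d main_default | grep libwheel
> # With --as-needed (the Ubuntu 11.10 default): the unused library is dropped.
> gcc -o main_asneeded main.c -L. -lwheel -Wl,--as-needed
> readelf -d main_asneeded | grep libwheel || echo "libwheel.so not linked"
> {code}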
> I already ran into the same issue with Hadoop-LZO 
> (https://github.com/kevinweil/hadoop-lzo/issues/33).  See the link and the 
> patch for more details.  For Hadoop, the problematic configure script is 
> {{native/configure}}.
> *How to reproduce*
> There are two ways to reproduce, depending on the OS you have at hand.
> 1. Use a stock Ubuntu 11.10 box and run a build that also compiles the native 
> libs:
> {code}
> # in the top level directory of the 'hadoop-common' repo,
> # i.e. where the BUILDING.txt file resides
> $ mvn -Pnative compile
> {code}
> 2. If you do not have Ubuntu 11.10 at hand, simply add {{-Wl,\--as-needed}} 
> explicitly to {{LDFLAGS}}.  This configures {{ld}} to work like Ubuntu 
> 11.10's default behavior.
> *Error message (for trunk/0.24)*
> Running the above build command will produce the following output (I added 
> {{-e -X}} switches to mvn).
> {code}
> [DEBUG] Executing: /bin/sh -l -c cd 
> /home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native
>  && make 
> DESTDIR=/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/target
>  install
> [INFO] /bin/bash ./libtool  --tag=CC   --mode=compile gcc -DHAVE_CONFIG_H -I. 
>  -I/usr/lib/jvm/default-java/include 
> -I/usr/lib/jvm/default-java/include/linux 
> -I/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/src
>  
> -I/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/javah
>  -I/usr/local/include -g -Wall -fPIC -O2 -m64 -g -O2 -MT ZlibCompressor.lo 
> -MD -MP -MF .deps/ZlibCompressor.Tpo -c -o ZlibCompressor.lo `test -f 
> 'src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c' || echo 
> './'`src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c
> [INFO] libtool: compile:  gcc -DHAVE_CONFIG_H -I. 
> -I/usr/lib/jvm/default-java/include -I/usr/lib/jvm/default-java/include/linux 
> -I/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/src
>  
> -I/home/mnoll/programming/git/hadoop/hadoop-common/hadoop-common-project/hadoop-common/target/native/javah
>  -I/usr/local/include -g -Wall -fPIC -O2 -m64 -g -O2 -MT ZlibCompressor.lo 
> -MD -MP -MF .deps/ZlibCompressor.Tpo -c 
> src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c  -fPIC -DPIC -o 
> .libs/ZlibCompressor.o
> [INFO] src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c: In function 
> 'Java_org_apache_hadoop_io_compress_zlib_ZlibCompressor_initIDs':
> [INFO] src/org/apache/hadoop/io/compress/zlib/ZlibCompressor.c:71:41: error: 
> expected expression before ',' token
> [INFO] make: *** [ZlibCompressor.lo] Error 1
> {code}
> *How to fix*
> The fix involves adding proper settings for {{LDFLAGS}} to the build config.  
> In trunk, this is {{hadoop-common-project/hadoop-common/pom.xml}}.  In 
> branches 1.0 and 0.20.20x, this is {{build.xml}}.
> Basically, the fix explicitly adds {{-Wl,\--no-as-needed}} to {{LDFLAGS}}.  
> Special care must be taken not to add this option when running on Mac OS as 
> its version of ld does not support this option (and does not need it because 
> by default it behaves as desired).
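> The fix can also be verified by hand before touching the build files, since the autoconf-generated configure script honours {{LDFLAGS}} from the environment (command sketch; run from the generated native build directory):
> {code}
> # e.g. hadoop-common-project/hadoop-common/target/native
> $ LDFLAGS="-Wl,--no-as-needed" ./configure
> $ make
> {code}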
