[ 
https://issues.apache.org/jira/browse/HDFS-7774?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14337225#comment-14337225
 ] 

Chris Nauroth commented on HDFS-7774:
-------------------------------------

Kiran, thank you for this patch.  It looks pretty good.

For parameterizing the CMake generator, I recommend using an Ant {{<condition>}} to set a {{generator}} property based on the value of the {{PLATFORM}} environment variable.  Then that property can be used in place of the hard-coded generator string in the cmake call.  Something like this ought to work:

{code}
                    <condition property="generator" value="Visual Studio 10" else="Visual Studio 10 Win64">
                      <equals arg1="Win32" arg2="${env.PLATFORM}" />
                    </condition>
                    <mkdir dir="${project.build.directory}/native"/>
                    <exec executable="cmake" dir="${project.build.directory}/native"
                        failonerror="true">
                        <arg line="${basedir}/src/ -DGENERATED_JAVAH=${project.build.directory}/native/javah -DJVM_ARCH_DATA_MODEL=${sun.arch.data.model} -DREQUIRE_LIBWEBHDFS=${require.libwebhdfs} -DREQUIRE_FUSE=${require.fuse} -G '${generator}'"/>
                    </exec>
{code}
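
Just to spell out how that resolves: {{generator}} keys off whatever {{PLATFORM}} the Windows build environment exports (the same variable the Windows build instructions have you set before a native build), so the two cases would look roughly like this.  This is only an illustration of the condition's behavior, not a tested recipe:

{code}
rem Illustration only: how the <condition> above would evaluate.
set PLATFORM=Win32
rem   -> ${generator} becomes "Visual Studio 10" (32-bit tools)

set PLATFORM=x64
rem   -> the equals check fails, so ${generator} falls back to "Visual Studio 10 Win64"
{code}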

Are the libhdfs tests passing for you with a 32-bit build?  I consistently get an {{OutOfMemoryError}} when the test JVM tries to create a thread:

{code}
     [exec] nmdCreate: Builder#build error:
     [exec] java.lang.OutOfMemoryError: unable to create new native thread
     [exec]     at java.lang.Thread.start0(Native Method)
     [exec]     at java.lang.Thread.start(Thread.java:714)
     [exec]     at io.netty.util.concurrent.SingleThreadEventExecutor.shutdownGracefully(SingleThreadEventExecutor.java:557)
     [exec]     at io.netty.util.concurrent.MultithreadEventExecutorGroup.shutdownGracefully(MultithreadEventExecutorGroup.java:146)
     [exec]     at io.netty.util.concurrent.AbstractEventExecutorGroup.shutdownGracefully(AbstractEventExecutorGroup.java:69)
     [exec]     at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.close(DatanodeHttpServer.java:165)
     [exec]     at org.apache.hadoop.hdfs.server.datanode.DataNode.shutdown(DataNode.java:1632)
     [exec]     at org.apache.hadoop.hdfs.MiniDFSCluster.shutdownDataNodes(MiniDFSCluster.java:1740)
     [exec]     at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1715)
     [exec]     at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1699)
     [exec]     at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:836)
     [exec]     at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:466)
     [exec]     at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:425)
     [exec] TEST_ERROR: failed on ..\..\src\main\native\libhdfs\test_libhdfs_threaded.c:330 (errno: 12): got NULL from tlhCluster
{code}
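
In case it helps with the 32-bit run: "unable to create new native thread" at that point usually means the 32-bit process has exhausted its address space rather than its heap, since MiniDFSCluster plus the Netty-based DataNode web server start a lot of threads.  libhdfs reads extra JVM options from the {{LIBHDFS_OPTS}} environment variable, so trimming the heap and per-thread stack before running the test might be worth a try.  The values below are untested guesses on my part, not a known-good configuration:

{code}
rem Untested guess: leave the 32-bit JVM started by test_libhdfs_threaded
rem more address space for thread stacks by shrinking heap and stack sizes.
set LIBHDFS_OPTS=-Xmx512m -Xss512k
{code}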


> Unresolved symbols error while compiling HDFS on Windows 7/32 bit
> -----------------------------------------------------------------
>
>                 Key: HDFS-7774
>                 URL: https://issues.apache.org/jira/browse/HDFS-7774
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: build, native
>    Affects Versions: 2.6.0
>         Environment: Windows 7, 32 bit, Visual Studio 10.
> Windows PATH: 
> PATH=C:\ProgramData\Oracle\Java\javapath;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;D:\PIG\pig-0.13.0\bin;C:\PROGRA~1\JAVA\JDK1.7.0_71\bin;C:\Program Files\Microsoft Windows Performance Toolkit\;C:\GNUWIN32\GETGNUWIN32\BIN;C:\CYGWIN\BIN;D:\git\cmd;D:\GIT\BIN;D:\MAVEN-3-2-3\APACHE-MAVEN-3.2.3-BIN\apache-maven-3.2.3\bin;D:\UTILS;c:\windows\Microsoft.NET\Framework\v4.0.30319;D:\cmake\bin;c:\progra~1\Micros~1.0\vc\crt\src;
> SDK Path:
> PATH=C:\Windows\Microsoft.NET\Framework\v4.0.30319;C:\Windows\Microsoft.NET\Framework\v3.5;;C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE;C:\Program Files\Microsoft Visual Studio 10.0\Common7\Tools;;C:\Program Files\Microsoft Visual Studio 10.0\VC\Bin;C:\Program Files\Microsoft Visual Studio 10.0\VC\Bin\VCPackages;;C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\NETFX 4.0 Tools;C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin;;C:\ProgramData\Oracle\Java\javapath;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;D:\PIG\pig-0.13.0\bin;C:\PROGRA~1\JAVA\JDK1.7.0_71\bin;C:\Program Files\Microsoft Windows Performance Toolkit\;C:\GNUWIN32\GETGNUWIN32\BIN;C:\CYGWIN\BIN;D:\git\cmd;D:\GIT\BIN;D:\MAVEN-3-2-3\APACHE-MAVEN-3.2.3-BIN\apache-maven-3.2.3\bin;D:\UTILS;c:\windows\Microsoft.NET\Framework\v4.0.30319;D:\cmake\bin;c:\progra~1\Micros~1.0\vc\crt\src;
>            Reporter: Venkatasubramaniam Ramakrishnan
>            Assignee: Kiran Kumar M R
>            Priority: Critical
>              Labels: build
>         Attachments: HDFS-7774-001.patch, Win32_Changes-temp.patch
>
>
> I am getting the following error in the hdfs module compilation:
> .
> .
> .
>   [exec] ClCompile:
>      [exec]   All outputs are up-to-date.
>      [exec] Lib:
>      [exec]   All outputs are up-to-date.
>      [exec]   hdfs_static.vcxproj -> D:\h\hadoop-2.6.0-src\hadoop-hdfs-project\hadoop-hdfs\target\native\target\bin\RelWithDebInfo\hdfs.lib
>      [exec] FinalizeBuildStatus:
>      [exec]   Deleting file "hdfs_static.dir\RelWithDebInfo\hdfs_static.unsuccessfulbuild".
>      [exec]   Touching "hdfs_static.dir\RelWithDebInfo\hdfs_static.lastbuildstate".
>      [exec] Done Building Project "D:\h\hadoop-2.6.0-src\hadoop-hdfs-project\hadoop-hdfs\target\native\hdfs_static.vcxproj" (default targets).
>      [exec] Done Building Project "D:\h\hadoop-2.6.0-src\hadoop-hdfs-project\hadoop-hdfs\target\native\ALL_BUILD.vcxproj" (default targets) -- FAILED.
>      [exec] 
>      [exec] Build FAILED.
>      [exec] 
>      [exec] "D:\h\hadoop-2.6.0-src\hadoop-hdfs-project\hadoop-hdfs\target\native\ALL_BUILD.vcxproj" (default target) (1) ->
>      [exec] "D:\h\hadoop-2.6.0-src\hadoop-hdfs-project\hadoop-hdfs\target\native\hdfs.vcxproj" (default target) (3) ->
>      [exec] (Link target) -> 
>      [exec]   thread_local_storage.obj : error LNK2001: unresolved external symbol _tls_used [D:\h\hadoop-2.6.0-src\hadoop-hdfs-project\hadoop-hdfs\target\native\hdfs.vcxproj]
>      [exec]   thread_local_storage.obj : error LNK2001: unresolved external symbol pTlsCallback [D:\h\hadoop-2.6.0-src\hadoop-hdfs-project\hadoop-hdfs\target\native\hdfs.vcxproj]
>      [exec]   D:\h\hadoop-2.6.0-src\hadoop-hdfs-project\hadoop-hdfs\target\native\target\bin\RelWithDebInfo\hdfs.dll : fatal error LNK1120: 2 unresolved externals [D:\h\hadoop-2.6.0-src\hadoop-hdfs-project\hadoop-hdfs\target\native\hdfs.vcxproj]
>      [exec] 
>      [exec]     0 Warning(s)
>      [exec]     3 Error(s)
>      [exec] 
>      [exec] Time Elapsed 00:00:40.39
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO] 
> [INFO] Apache Hadoop HDFS ................................. FAILURE [02:27 min]
> [INFO] Apache Hadoop HttpFS ............................... SKIPPED


