[ 
https://issues.apache.org/jira/browse/MAPREDUCE-1270?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13123646#comment-13123646
 ] 

linyihang commented on MAPREDUCE-1270:
--------------------------------------

Hello Mr. Yang,
      I am new to HCE. I downloaded HCE 1.0 and tried to build it with 
"./build.sh", but it failed with these errors:
      "../hadoop_hce_v1/hadoop-0.20.3/../java6/jre/bin/java: 1: Syntax error: 
"(" unexpected
       ../hadoop_hce_v1/hadoop-0.20.3/../java6/jre/bin/java: 1: Syntax error: 
"(" unexpected
      "
      Then I modified build.sh by commenting out the lines below, since I had 
already installed jdk1.6.0_21 and ant 1.8.2:
      " # prepare the java and ant ENV
        #export JAVA_HOME=${workdir}/../java6
        #export ANT_HOME=${workdir}/../ant
        #export PATH=${JAVA_HOME}/bin:${ANT_HOME}/bin:$PATH"
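      Rather than commenting the exports out, another option is to point them 
at the locally installed tools; a minimal sketch, where the install paths are 
hypothetical examples and would need adjusting to the actual machine:

```shell
# Point the build at an already-installed JDK and Ant instead of the bundled
# ones. The paths below are illustrative, not the real install locations.
export JAVA_HOME=/usr/lib/jvm/jdk1.6.0_21
export ANT_HOME=/opt/apache-ant-1.8.2
export PATH=${JAVA_HOME}/bin:${ANT_HOME}/bin:$PATH
echo "Using JAVA_HOME=${JAVA_HOME}"
```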
      But then there are errors like 
      "
      [exec] /usr/include/linux/tcp.h:77: error: ‘__u32 __fswab32(__u32)’ 
cannot appear in a constant-expression
      "
      and I never see "BUILD SUCCESSFUL" as InstallMenu.pdf shows. My OS 
is Ubuntu 10.10, and GCC is version 4.4.5.

      Have I made any mistake?

      Going further, I tried to fix the error, as someone on Google suggested, 
by replacing "#include <linux/tcp.h>" with "#include <netinet/tcp.h>" in 
"../hadoop_hce_v1/hadoop-0.20.3/src/c++/hce/impl/Common/Type.hh", and also 
adding "#include <stdint.h>" to Type.hh. But that produced a lot of what I 
think are mistakes, such as "printf("%lld")" coming out as "printf("lld")", 
and one serious error, as follows, which worries me a lot.
      The serious error is:
      "
       [exec]   then mv -f ".deps/CompressionFactory.Tpo" 
".deps/CompressionFactory.Po"; else rm -f ".deps/CompressionFactory.Tpo"; exit 
1; fi
     [exec] In file included from /usr/include/limits.h:153,
     [exec]                  from 
/usr/lib/gcc/i686-linux-gnu/4.4.5/include-fixed/limits.h:122,
     [exec]                  from 
/usr/lib/gcc/i686-linux-gnu/4.4.5/include-fixed/syslimits.h:7,
     [exec]                  from 
/usr/lib/gcc/i686-linux-gnu/4.4.5/include-fixed/limits.h:11,
     [exec]                  from 
/home/had/文档/HCE/bak/hadoop_hce_v1/hadoop-0.20.3/src/c++/hce/impl/../../../../nativelib/lzo/lzo/lzoconf.h:52,
     [exec]                  from 
/home/had/文档/HCE/bak/hadoop_hce_v1/hadoop-0.20.3/src/c++/hce/impl/../../../../nativelib/lzo/lzo/lzo1.h:45,
     [exec]                  from 
/home/had/文档/HCE/bak/hadoop_hce_v1/hadoop-0.20.3/src/c++/hce/impl/Compress/LzoCompressor.hh:23,
     [exec]                  from 
/home/had/文档/HCE/bak/hadoop_hce_v1/hadoop-0.20.3/src/c++/hce/impl/Compress/LzoCodec.hh:27,
     [exec]                  from 
/home/had/文档/HCE/bak/hadoop_hce_v1/hadoop-0.20.3/src/c++/hce/impl/Compress/CompressionFactory.cc:23:
     [exec] /usr/include/bits/xopen_lim.h:95: error: missing binary operator 
before token "("
     [exec] /usr/include/bits/xopen_lim.h:98: error: missing binary operator 
before token "("
     [exec] /usr/include/bits/xopen_lim.h:122: error: missing binary operator 
before token "("
     [exec] make[1]: Leaving directory 
`/home/had/文档/HCE/bak/hadoop_hce_v1/hadoop-0.20.3/build/c++-build/Linux-i386-32/hce/impl/Compress'
     [exec] make[1]: *** [CompressionFactory.o] Error 1
     [exec] make: *** [install-recursive] Error 1
      "

      How can I fix this error?

                
> Hadoop C++ Extension
> --------------------
>
>                 Key: MAPREDUCE-1270
>                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-1270
>             Project: Hadoop Map/Reduce
>          Issue Type: Improvement
>          Components: task
>    Affects Versions: 0.20.1
>         Environment:  hadoop linux
>            Reporter: Wang Shouyan
>         Attachments: HADOOP-HCE-1.0.0.patch, HCE InstallMenu.pdf, HCE 
> Performance Report.pdf, HCE Tutorial.pdf, Overall Design of Hadoop C++ 
> Extension.doc
>
>
>   Hadoop C++ extension is an internal project at Baidu. We started it for these 
> reasons:
>    1  To provide a C++ API. We mostly used Streaming before, and we also tried 
> PIPES, but we did not find PIPES more efficient than Streaming. So we think a 
> new C++ extension is needed for us.
>    2  Even using PIPES or Streaming, it is hard to control the memory of the 
> hadoop map/reduce Child JVM.
>    3  It costs too much to read/write/sort TB/PB of data in Java. When using 
> PIPES or Streaming, a pipe or socket is not efficient enough to carry such 
> huge data.
>    What we want to do: 
>    1 We do not use the map/reduce Child JVM to do any data processing; it just 
> prepares the environment, starts the C++ mapper, tells the mapper which split 
> it should deal with, and reads reports from the mapper until it finishes. The 
> mapper will read records, invoke the user-defined map, do the partitioning, 
> write spills, combine, and merge into file.out. We think these operations can 
> be done in C++ code.
>    2 The reducer is similar to the mapper; it is started after the sort 
> finishes, reads from the sorted files, invokes the user-defined reduce, and 
> writes to the user-defined record writer.
>    3 We also intend to rewrite shuffle and sort in C++, for efficiency and 
> memory control.
>    At first, 1 and 2, then 3.
>    What's the difference from PIPES:
>    1 Yes, we will reuse most of the PIPES code.
>    2 But we will do it more completely: nothing changes in scheduling and 
> management, but everything changes in execution.
> *UPDATE:*
> Now you can get a test version of HCE from this link 
> http://docs.google.com/leaf?id=0B5xhnqH1558YZjcxZmI0NzEtODczMy00NmZiLWFkNjAtZGM1MjZkMmNkNWFk&hl=zh_CN&pli=1
> This is a full package with all hadoop source code.
> Following the document "HCE InstallMenu.pdf" in the attachments, you can 
> build and deploy it in your cluster.
> The attachment "HCE Tutorial.pdf" will lead you through writing your first 
> HCE program and gives other specifications of the interface.
> The attachment "HCE Performance Report.pdf" gives a performance report of HCE 
> compared to Java MapReduce and Pipes.
> Any comments are welcome.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: 
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira

