I haven't seen this error before, but I would suggest following this tutorial,
especially the part about the sharelib. Make sure that Oozie can see your
Hadoop installation on the machine it runs on, including HDFS of course.

http://hadooptutorial.info/apache-oozie-installation-on-ubuntu-14-04/
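
If it helps, the usual checks look something like this (the paths here are
assumptions; adjust them to your install). In conf/oozie-site.xml, point Oozie
at the directory holding your cluster's core-site.xml and hdfs-site.xml:

  <property>
    <name>oozie.service.HadoopAccessorService.hadoop.configurations</name>
    <value>*=/etc/hadoop/conf</value>
  </property>

Then recreate the sharelib against the same NameNode URI your cluster uses,
and verify that it landed (the target user directory is an assumption too):

  bin/oozie-setup.sh sharelib create -fs hdfs://m1.local/
  hdfs dfs -ls /user/$USER/share/lib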


On Tue, Sep 8, 2015 at 6:12 PM, Dima Fadeyev <[email protected]> wrote:

> Hello, everyone,
>
> I've downloaded Oozie from trunk and compiled it with this command:
>   bin/mkdistro.sh -e -Phadoop-2 -DskipTests -Dhadoop.version=2.7.1
>
> When trying to create the sharelib directory, I get this exception:
>
> bin/oozie-setup.sh sharelib create -fs hdfs://m1.local/
> setting CATALINA_OPTS="$CATALINA_OPTS -Xmx1024m"
>
> Error: File /user/bbvoop/share/lib/lib_20150908180437/hive2/jersey-json-1.9.jar
> could only be replicated to 0 nodes instead of minReplication (=1). There are
> 3 datanode(s) running and 3 node(s) are excluded in this operation.
>   at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1550)
>   at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getNewBlockTargets(FSNamesystem.java:3110)
>   at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3034)
>   at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:723)
>   at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:492)
>   at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>   at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
>   at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
>   at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.auth.Subject.doAs(Subject.java:415)
>   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>   at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
>
> I can see the same error in the NameNode's logs. HDFS itself is working: if
> I upload a file with "hdfs dfs -put", everything works.
>
> In the DataNode logs I see this INFO message:
>
> 2015-09-08 18:09:52,905 INFO org.apache.hadoop.hdfs.server.datanode.DataNode:
> Failed to read expected SASL data transfer protection handshake from client
> at /10.104.18.113:59340. Perhaps the client is running an older version of
> Hadoop which does not support SASL data transfer protection
>
> So I guess either there is an incompatibility between my Hadoop and Oozie
> libraries, or Oozie does not know that my Hadoop setup is using SSL (and
> some other Hadoop configuration is probably missing too). But I don't know
> where to look next.
>
> Could anyone help me resolve this?
>
> Thanks in advance and best regards.
>
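
One more thought on that SASL message: if the cluster enables SASL data
transfer protection, the hdfs-site.xml that the Oozie client reads must carry
the same setting, or the DataNodes will reject the handshake exactly like
this. A minimal sketch, assuming the cluster uses "integrity" (substitute
whatever value your cluster's hdfs-site.xml actually sets):

  <property>
    <name>dfs.data.transfer.protection</name>
    <value>integrity</value>
  </property>

That would also explain why a plain "hdfs dfs -put" works (it picks up the
cluster's client configuration) while the sharelib upload fails.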



-- 
JAK,

Ahmed Kamal
Junior Big Data Engineer, BadrIT
