Hi,

I have written an Oozie Java action that interacts with HBase. I am able to
submit the coordinator job successfully, but when Oozie tries to submit the
workflow job for the coordinator, I get the exception below in oozie.log on
the Oozie server.

Exception:

2016-03-24 01:32:07,772  WARN ActionStartXCommand:546 - SERVER[example-qa1-dn3] USER[admin] GROUP[-] TOKEN[] APP[sqoop-dataimport-wf] JOB[0000011-160324011645765-oozie-oozi-W] ACTION[0000011-160324011645765-oozie-oozi-W@java-node] Error starting action [java-node]. ErrorType [ERROR], ErrorCode [ELException], Message [ELException: variable [hbase] cannot be resolved]
org.apache.oozie.action.ActionExecutorException: ELException: variable [hbase] cannot be resolved
        at org.apache.oozie.action.ActionExecutor.convertException(ActionExecutor.java:401)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:980)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1135)
        at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:228)
        at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:63)
        at org.apache.oozie.command.XCommand.call(XCommand.java:281)
        at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:323)
        at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:252)
        at org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:174)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: javax.servlet.jsp.el.ELException: variable [hbase] cannot be resolved
        at org.apache.oozie.util.ELEvaluator$Context.resolveVariable(ELEvaluator.java:106)
        at org.apache.commons.el.NamedValue.evaluate(NamedValue.java:124)
        at org.apache.commons.el.ComplexValue.evaluate(ComplexValue.java:140)
        at org.apache.commons.el.ExpressionString.evaluate(ExpressionString.java:114)
        at org.apache.commons.el.ExpressionEvaluatorImpl.evaluate(ExpressionEvaluatorImpl.java:274)
        at org.apache.commons.el.ExpressionEvaluatorImpl.evaluate(ExpressionEvaluatorImpl.java:190)
        at org.apache.oozie.util.ELEvaluator.evaluate(ELEvaluator.java:203)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.getCredProperties(JavaActionExecutor.java:1111)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.getActionCredentialsProperties(JavaActionExecutor.java:1077)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.setCredentialPropertyToActionConf(JavaActionExecutor.java:1020)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:907)
        ... 10 more

As per the stack trace, the variable hbase cannot be resolved. However, as
you can see in my workflow XML, I have defined a credential of type hbase,
and I have also registered a credentials class for the hbase type in
oozie-site.xml.
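
The oozie-site.xml entry I am referring to is of the form below. I am
paraphrasing my setup here using the stock Oozie credential provider classes,
so please treat it as illustrative rather than an exact copy of my config:

<property>
   <name>oozie.credentials.credentialclasses</name>
   <value>hcat=org.apache.oozie.action.hadoop.HCatCredentials,hbase=org.apache.oozie.action.hadoop.HbaseCredentials</value>
</property>

My understanding is that if the hbase=... mapping were missing, Oozie's EL
evaluator would treat hbase as an undefined variable, which would match the
exception above. But as far as I can tell the entry is present in my setup.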

Please find the coordinator and workflow XMLs below:

Coordinator.xml:

<coordinator-app name="${appName}" frequency="${frequency}" 
start="${startTime}" end="${endTime}" timezone="UTC" 
xmlns="uri:oozie:coordinator:0.4">
   <action>
      <workflow>
         <app-path>${WORKFLOW_APP_PATH}</app-path>
      </workflow>
   </action>
</coordinator-app>
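
For context, the coordinator parameters above (and the ones referenced in the
workflow) come from a job.properties along these lines. The values shown here
are only placeholders, not my real ones:

appName=sqoop-dataimport-coord
frequency=1440
startTime=2016-03-24T00:00Z
endTime=2016-12-31T00:00Z
WORKFLOW_APP_PATH=hdfs://example-dev1-nn:8020/user/admin/apps/sqoop-dataimport
oozie.coord.application.path=hdfs://example-dev1-nn:8020/user/admin/apps/sqoop-dataimport
nameNode=hdfs://example-dev1-nn:8020
jobTracker=example-dev1-nn:8050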

Workflow.xml:
<workflow-app name="sqoop-dataimport-wf" xmlns="uri:oozie:workflow:0.4">
<credentials>
  <credential name='hcat-cred' type='hcat'>
    <property>
      <name>hcat.metastore.uri</name>
      <value>thrift://example-dev1-dn2:9083</value>
    </property>
    <property>
      <name>hcat.metastore.principal</name>
      <value>hive/[email protected]</value>
    </property>
  </credential>

  <credential name='hbase-cred' type='hbase'>
    <property>
      <name>dfs.domain.socket.path</name>
      <value>/var/lib/hadoop-hdfs/dn_socket</value>
    </property>
    
    <property>
      <name>hbase.bulkload.staging.dir</name>
      <value>/apps/hbase/staging</value>
    </property>
    
    <property>
      <name>hbase.client.keyvalue.maxsize</name>
      <value>1048576</value>
    </property>
    
    <property>
      <name>hbase.client.retries.number</name>
      <value>35</value>
    </property>
    
    <property>
      <name>hbase.client.scanner.caching</name>
      <value>100</value>
    </property>
    
    <property>
      <name>hbase.cluster.distributed</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hbase.coprocessor.master.classes</name>
      <value>org.apache.hadoop.hbase.security.access.AccessController</value>
    </property>
    
    <property>
      <name>hbase.coprocessor.region.classes</name>
      <value>org.apache.hadoop.hbase.security.token.TokenProvider,org.apache.hadoop.hbase.security.access.SecureBulkLoadEndpoint,org.apache.hadoop.hbase.security.access.AccessController</value>
    </property>
    
    <property>
      <name>hbase.coprocessor.regionserver.classes</name>
      <value></value>
    </property>
    
    <property>
      <name>hbase.defaults.for.version.skip</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hbase.hregion.majorcompaction</name>
      <value>604800000</value>
    </property>
    
    <property>
      <name>hbase.hregion.majorcompaction.jitter</name>
      <value>0.50</value>
    </property>
    
    <property>
      <name>hbase.hregion.max.filesize</name>
      <value>10737418240</value>
    </property>
    
    <property>
      <name>hbase.hregion.memstore.block.multiplier</name>
      <value>4</value>
    </property>
    
    <property>
      <name>hbase.hregion.memstore.flush.size</name>
      <value>134217728</value>
    </property>
    
    <property>
      <name>hbase.hregion.memstore.mslab.enabled</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hbase.hstore.blockingStoreFiles</name>
      <value>10</value>
    </property>
    
    <property>
      <name>hbase.hstore.compaction.max</name>
      <value>10</value>
    </property>
    
    <property>
      <name>hbase.hstore.compactionThreshold</name>
      <value>3</value>
    </property>
    
    <property>
      <name>hbase.master.info.bindAddress</name>
      <value>0.0.0.0</value>
    </property>
    
    <property>
      <name>hbase.master.info.port</name>
      <value>16010</value>
    </property>
    
    <property>
      <name>hbase.master.kerberos.principal</name>
      <value>hbase/[email protected]</value>
    </property>
    
    <property>
      <name>hbase.master.keytab.file</name>
      <value>/etc/security/keytabs/hbase.service.keytab</value>
    </property>
    
    <property>
      <name>hbase.master.port</name>
      <value>16000</value>
    </property>
    
    <property>
      <name>hbase.regionserver.global.memstore.size</name>
      <value>0.4</value>
    </property>
    
    <property>
      <name>hbase.regionserver.handler.count</name>
      <value>30</value>
    </property>
    
    <property>
      <name>hbase.regionserver.info.port</name>
      <value>16030</value>
    </property>
    
    <property>
      <name>hbase.regionserver.kerberos.principal</name>
      <value>hbase/[email protected]</value>
    </property>
    
    <property>
      <name>hbase.regionserver.keytab.file</name>
      <value>/etc/security/keytabs/hbase.service.keytab</value>
    </property>
    
    <property>
      <name>hbase.regionserver.port</name>
      <value>16020</value>
    </property>
    
    <property>
      <name>hbase.regionserver.wal.codec</name>
      <value>org.apache.hadoop.hbase.regionserver.wal.WALCellCodec</value>
    </property>
    
    <property>
      <name>hbase.rootdir</name>
      <value>hdfs://example-dev1-nn:8020/apps/hbase/data</value>
    </property>
    
    <property>
      <name>hbase.rpc.controllerfactory.class</name>
      <value></value>
    </property>
    
    <property>
      <name>hbase.rpc.protection</name>
      <value>authentication</value>
    </property>
    
    <property>
      <name>hbase.rpc.timeout</name>
      <value>90000</value>
    </property>
    
    <property>
      <name>hbase.security.authentication</name>
      <value>kerberos</value>
    </property>
    
    <property>
      <name>hbase.security.authorization</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hbase.superuser</name>
      <value>hbase</value>
    </property>
    
    <property>
      <name>hbase.zookeeper.property.clientPort</name>
      <value>2181</value>
    </property>
    
    <property>
      <name>hbase.zookeeper.quorum</name>
      <value>example-dev1-dn1,example-dev1-dn3,example-dev1-dn4</value>
    </property>
    
    <property>
      <name>hbase.zookeeper.useMulti</name>
      <value>true</value>
    </property>
    
    <property>
      <name>hfile.block.cache.size</name>
      <value>0.40</value>
    </property>
    
    <property>
      <name>phoenix.connection.autoCommit</name>
      <value>true</value>
    </property>
    
    <property>
      <name>phoenix.functions.allowUserDefinedFunctions</name>
      <value> </value>
    </property>
    
    <property>
      <name>phoenix.query.timeoutMs</name>
      <value>60000</value>
    </property>
    
    <property>
      <name>phoenix.queryserver.kerberos.principal</name>
      <value>hbase/[email protected]</value>
    </property>
    
    <property>
      <name>phoenix.queryserver.keytab.file</name>
      <value>/etc/security/keytabs/hbase.service.keytab</value>
    </property>
    
    <property>
      <name>zookeeper.session.timeout</name>
      <value>90000</value>
    </property>
    
    <property>
      <name>zookeeper.znode.parent</name>
      <value>/hbase-secure</value>
    </property>

  </credential>
</credentials>

<start to="java-node"/>

<action name="java-node" cred='hbase-cred,hcat-cred'>
  <java>
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <main-class>com.demo.example.ingestion.dbextract.DataTransferTool</main-class>
    <arg>incremental-job</arg>
    <arg>--org-code</arg>
    <arg>${ORGANIZATION_CODE}</arg>
    <arg>--ss-code</arg>
    <arg>${SOURCE_SYSTEM_CODE}</arg>
    <arg>--config-file</arg>
    <arg>${INCREMENTAL_COLUMNS_CONFIG_FILEPATH}</arg>
    <arg>--hive-import</arg>
    <arg>--workflow-id</arg>
    <arg>${wf:id()}</arg>
    <arg>--job-name</arg>
    <arg>${JOB_NAME}</arg>
  </java>
  <ok to="end"/>
  <error to="fail"/>
</action>

<kill name="fail">
  <message>Java failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>

<end name="end"/>
</workflow-app>


Can anyone help me out with this issue?

Thanks & Regards,
Ashish
