[ 
https://issues.apache.org/jira/browse/OOZIE-1154?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13546401#comment-13546401
 ] 

Rohini Palaniswamy commented on OOZIE-1154:
-------------------------------------------

Sorry. I was mistaken. The problem is actually different. It is trying to 
create a _SUCCESS marker file for hbase:// since that is the input path. Can 
you add mapreduce.fileoutputcommitter.marksuccessfuljobs as false to the pig 
action's configuration?
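
A minimal sketch of how that property could be set, assuming a standard pig 
action in the workflow definition (the action name and script name are 
placeholders; only the property name/value come from the comment above):

    <action name="pig-node">
        <pig>
            <!-- job-tracker, name-node, prepare, etc. omitted -->
            <configuration>
                <property>
                    <!-- suppress creation of the _SUCCESS marker file,
                         which fails for the hbase:// path -->
                    <name>mapreduce.fileoutputcommitter.marksuccessfuljobs</name>
                    <value>false</value>
                </property>
            </configuration>
            <script>scriptname.pig</script>
        </pig>
        <ok to="end"/>
        <error to="fail"/>
    </action>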
                
> Pig action throwing exception when inserting into Hbase table through pig 
> script
> --------------------------------------------------------------------------------
>
>                 Key: OOZIE-1154
>                 URL: https://issues.apache.org/jira/browse/OOZIE-1154
>             Project: Oozie
>          Issue Type: Bug
>          Components: action, workflow
>    Affects Versions: 3.2.0
>         Environment: 
> hadoop-2.0.0-mr1-cdh4.1.2,hbase-0.92.1-cdh4.1.2,pig-0.10.0-cdh4.1.2,oozie-3.2.0.
>  
>            Reporter: shobin joseph
>              Labels: hadoop, hbase, oozie, pig
>             Fix For: 3.2.0
>
>   Original Estimate: 216h
>  Remaining Estimate: 216h
>
> A Pig script to load data from HDFS into an HBase table was successfully 
> executed in pseudo map-reduce mode (using pig -x mapreduce scriptname.pig), 
> but when the same script was executed through an Oozie workflow, an 
> exception was thrown even though the data was loaded into the HBase table. 
> This exception breaks the Oozie workflow.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira