You should also verify that the Pig job works as expected when just issuing DUMP and that HBase works as expected when manually inserting a few rows (without Pig).
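A minimal sketch of that sanity check, assuming the same sample_data.csv and sample_names table from the thread below (the row key and values in the hbase shell lines are placeholders):

```pig
-- Sanity check: run the LOAD by itself and DUMP to the console,
-- taking HBaseStorage out of the picture entirely.
raw_data = LOAD 'sample_data.csv' USING PigStorage(',') AS (
    listing_id: chararray,
    fname: chararray,
    lname: chararray );
DUMP raw_data;

-- If DUMP succeeds, check HBase separately from the hbase shell:
--   create 'sample_names', 'info'
--   put 'sample_names', 'row1', 'info:fname', 'John'
--   scan 'sample_names'
```

If DUMP prints the tuples and the manual put/scan round-trips, the remaining suspect is the STORE step (classpath, ZooKeeper quorum, or a missing column family).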
On Wed, Mar 7, 2012 at 8:45 AM, Raghu Angadi <[email protected]> wrote:
>
> More information at:
> http://hadoop1:50030/jobdetails.jsp?jobid=job_201203071602_0001
>
> did you check that? From that link you can also navigate to output from
> the mapper task.
> did you create the "info" column family in the table?
>
> On Wed, Mar 7, 2012 at 7:47 AM, Marcin Cylke <[email protected]> wrote:
>
> > Hi
> >
> > I'm following a short tutorial from
> > http://blog.whitepages.com/2011/10/27/hbase-storage-and-pig/
> >
> > I have a running HBase cluster and Hadoop cluster.
> >
> > Steps I've performed:
> > - prepared a sample input file and put it on HDFS
> > - created a table in HBase
> > - created a script file with contents:
> >
> > raw_data = LOAD 'sample_data.csv' USING PigStorage( ',' ) AS (
> >     listing_id: chararray,
> >     fname: chararray,
> >     lname: chararray );
> >
> > STORE raw_data INTO 'hbase://sample_names' USING
> > org.apache.pig.backend.hadoop.hbase.HBaseStorage (
> >     'info:fname info:lname');
> >
> > and the whole thing freezes. It gets to the point saying
> >
> > HadoopJobId: job_201203071602_0001
> > More information at:
> > http://hadoop1:50030/jobdetails.jsp?jobid=job_201203071602_0001
> >
> > and after some time the job fails.
> > The trace log from this run contains only this exception:
> >
> > Pig Stack Trace
> > ---------------
> > ERROR 2244: Job failed, hadoop does not return any error message
> >
> > org.apache.pig.backend.executionengine.ExecException: ERROR 2244: Job
> > failed, hadoop does not return any error message
> >         at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:139)
> >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:192)
> >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:164)
> >         at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
> >         at org.apache.pig.Main.run(Main.java:561)
> >         at org.apache.pig.Main.main(Main.java:111)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >         at java.lang.reflect.Method.invoke(Method.java:597)
> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> > ================================================================================
> >
> > I've attached logs from this execution (pig.log)
> >
> > My environment:
> > - HBase 0.92.0
> > - pig 0.9.2
> > - hadoop 1.0.0
> >
> > Is there something wrong in what I'm doing?
> >
> > Regards
> > Marcin

--
Note that I'm no longer using my Yahoo! email address. Please email me at
[email protected] going forward.
