[
https://issues.apache.org/jira/browse/HCATALOG-302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13228814#comment-13228814
]
Alan Gates commented on HCATALOG-302:
-------------------------------------
After applying this patch, when I run
{code}
HADOOP_HOME=/homes/hortonal/grid/mupp/hadoop-1.0.1//
HADOOP_CLASSPATH=${harness.pig.jar}:/home/hortonal/src/hcat/top/hdp1/branch-0.4/src/test/e2e/hcatalog/../../../..//build/hcatalog/hcatalog-0.4.0.jar:/home/hortonal/src/hcat/top/hdp1/branch-0.4/src/test/e2e/hcatalog/../../../..//hive/external/build/metastore/hive-metastore-0.8.1.jar:/home/hortonal/src/hcat/top/hdp1/branch-0.4/src/test/e2e/hcatalog/../../../..//hive/external/build/dist/lib/libthrift.jar:/home/hortonal/src/hcat/top/hdp1/branch-0.4/src/test/e2e/hcatalog/../../../..//hive/external/build/dist/lib/hive-exec-0.8.1.jar:/home/hortonal/src/hcat/top/hdp1/branch-0.4/src/test/e2e/hcatalog/../../../..//hive/external/build/dist/lib/libfb303.jar:/home/hortonal/src/hcat/top/hdp1/branch-0.4/src/test/e2e/hcatalog/../../../..//hive/external/build/dist/lib/jdo2-api-2.3-ec.jar::/home/hortonal/src/hcat/top/hdp1/branch-0.4/src/test/e2e/hcatalog/../../../..//storage-handlers/hbase/build/ivy/lib/hbase-storage-handler/hbase-0.92.0.jar:/home/hortonal/src/hcat/top/hdp1/branch-0.4/src/test/e2e/hcatalog/../../../..//storage-handlers/hbase/build/ivy/lib/hbase-storage-handler/zookeeper-3.4.3.jar:/home/hortonal/src/hcat/top/hdp1/branch-0.4/src/test/e2e/hcatalog/../../../..//storage-handlers/hbase/build/hbase-storage-handler/hbase-storage-handler-0.1.0.jar:/home/hortonal/src/hcat/top/hdp1/branch-0.4/src/test/e2e/hcatalog/../../../..//hive/external/build/dist/lib/hive-hbase-handler-0.8.1.jar:/homes/hortonal/grid/mupp/hcat//etc/hcatalog:/homes/hortonal/grid/mupp/hbase-0.92.0/conf/
HIVE_HOME=/home/hortonal/src/hcat/top/hdp1/branch-0.4/src/test/e2e/hcatalog/../../../../hive/external/build/dist
HIVE_ROOT=/home/hortonal/src/hcat/top/hdp1/branch-0.4/src/test/e2e/hcatalog/../../../..//hive/external/
/homes/hortonal/grid/mupp/hcat//bin/hcat -e "create table pig_hbase_2_1(key string, age string, gpa string) STORED BY 'org.apache.hcatalog.hbase.HBaseHCatStorageHandler' TBLPROPERTIES ('hbase.columns.mapping'=':key,info:age,info:gpa');"
{code}
I get:
{code}
FAILED: Error in metadata: MetaException(message:java.io.IOException:
org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@9fdee closed
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:794)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:782)
        at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:249)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:213)
        at org.apache.hcatalog.hbase.HBaseHCatStorageHandler.preCreateTable(HBaseHCatStorageHandler.java:329)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:396)
        at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:540)
        at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3255)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:238)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:133)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1332)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1123)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
        at org.apache.hcatalog.cli.HCatDriver.run(HCatDriver.java:42)
        at org.apache.hcatalog.cli.HCatCli.processCmd(HCatCli.java:250)
        at org.apache.hcatalog.cli.HCatCli.processLine(HCatCli.java:204)
        at org.apache.hcatalog.cli.HCatCli.processFile(HCatCli.java:223)
        at org.apache.hcatalog.cli.HCatCli.main(HCatCli.java:168)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
)
{code}
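The trace suggests the shared HConnection is already closed by the time preCreateTable constructs the HTable, so the failure is raised from the plain HBase client path that HCatalog calls into. As a quick way to separate an HBase/ZooKeeper configuration problem from anything this patch changed, a minimal standalone check along these lines can be run with the same classpath (the CheckHBase class name is illustrative, not part of the codebase):
{code}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HBaseAdmin;

// Illustrative sketch: build a client Configuration the same way the storage
// handler would (picking hbase-site.xml up from the classpath) and ask the
// cluster whether it is reachable at all.
public class CheckHBase {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Throws MasterNotRunningException / ZooKeeperConnectionException if a
        // working connection cannot be established with this configuration.
        HBaseAdmin.checkHBaseAvailable(conf);
        System.out.println("HBase is reachable with this configuration");
    }
}
{code}
If this check passes but the create table statement still fails, the closed-connection error more likely comes from how the handler (or the patch) manages the connection it obtains.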
> unable to write to hbase channel. HBaseHCatStorageHandler class not found
> -------------------------------------------------------------------------
>
> Key: HCATALOG-302
> URL: https://issues.apache.org/jira/browse/HCATALOG-302
> Project: HCatalog
> Issue Type: Bug
> Components: hbase
> Affects Versions: 0.4
> Reporter: David Capwell
> Assignee: Rohini Palaniswamy
>
> This is the pig script:
> PigServer pigServer = PigServerBuilder.create(this.client.getConf());
> pigServer.registerQuery("A = LOAD '"+input+"' USING PigStorage()
> AS(key:chararray, value:chararray);");
> pigServer.registerQuery("STORE A INTO '"+this.getDb().getName() + "." +
> this.getTable().getName()+"' USING org.apache.hcatalog.pig.HCatStorer();");
> Error:
> 2012-03-09 03:04:12,105 WARN org.apache.hadoop.mapred.Child: Error running child
> java.io.IOException: Error in loading storage handler.org.apache.hcatalog.hbase.HBaseHCatStorageHandler
>         at org.apache.hcatalog.common.HCatUtil.getStorageHandler(HCatUtil.java:518)
>         at org.apache.hcatalog.common.HCatUtil.getStorageHandler(HCatUtil.java:474)
>         at org.apache.hcatalog.mapreduce.HCatBaseOutputFormat.getOutputFormat(HCatBaseOutputFormat.java:77)
>         at org.apache.hcatalog.mapreduce.HCatOutputFormat.getOutputCommitter(HCatOutputFormat.java:250)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputCommitter.getCommitters(PigOutputCommitter.java:89)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputCommitter.<init>(PigOutputCommitter.java:67)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getOutputCommitter(PigOutputFormat.java:278)
>         at org.apache.hadoop.mapred.Task.initialize(Task.java:515)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:353)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1082)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.hbase.HBaseHCatStorageHandler
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:247)
>         at org.apache.hcatalog.common.HCatUtil.getStorageHandler(HCatUtil.java:512)
> ... 13 more
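The ClassNotFoundException in the task log indicates that HBaseHCatStorageHandler resolves on the client but is missing from the classpath of the map task JVMs, i.e. the storage-handler jar and its HBase/ZooKeeper dependencies are apparently not shipped with the job. One common way to ship extra jars through the PigServer API is sketched below; the class name, paths, and table name are illustrative only, and whether this works around or merely masks the underlying bug is a separate question:
{code}
import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

// Sketch only: register the storage-handler jar and its dependencies so Pig
// places them on the distributed cache / task classpath before the STORE runs.
public class StoreWithHCat {
    public static void main(String[] args) throws Exception {
        PigServer pigServer = new PigServer(ExecType.MAPREDUCE);
        pigServer.registerJar("/path/to/hbase-storage-handler-0.1.0.jar"); // illustrative paths
        pigServer.registerJar("/path/to/hbase-0.92.0.jar");
        pigServer.registerJar("/path/to/zookeeper-3.4.3.jar");
        pigServer.registerQuery("A = LOAD '/path/to/input' USING PigStorage() AS (key:chararray, value:chararray);");
        pigServer.registerQuery("STORE A INTO 'default.pig_hbase_2_1' USING org.apache.hcatalog.pig.HCatStorer();");
    }
}
{code}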