I found the error in my config.
My hadoop-env.sh declares a HADOOP_CLASSPATH variable, which is
required to work with HBase. But it overwrote the HADOOP_CLASSPATH
that Hive sets to work with Hadoop. So I changed it to append to the
variable instead of overwriting it, and then it worked.
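For anyone hitting the same problem, here is a minimal sketch of the fix in hadoop-env.sh (the HBase jar and conf paths are hypothetical; substitute your own):

```shell
# hadoop-env.sh
# Before (overwrites whatever Hive already put in HADOOP_CLASSPATH):
#   export HADOOP_CLASSPATH=/opt/hbase/hbase-0.20.0.jar:/opt/hbase/conf
#
# After: append the HBase entries, preserving any existing value.
# The ${VAR:+...} expansion adds the separating colon only when
# HADOOP_CLASSPATH is already non-empty, avoiding a leading ":".
export HADOOP_CLASSPATH=${HADOOP_CLASSPATH:+${HADOOP_CLASSPATH}:}/opt/hbase/hbase-0.20.0.jar:/opt/hbase/conf
```

With this form, the same line works whether or not Hive (or anything else) exported HADOOP_CLASSPATH beforehand.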

Thanks,
Shotaro Kamio


On Mon, Sep 7, 2009 at 4:57 PM, Shotaro Kamio<[email protected]> wrote:
> Hello,
>
> I've built and installed hive-trunk with hadoop 0.20.0. But I cannot
> execute query on it.
> I was able to create table 'pokes' in example and load data into it
> successfully. But when I execute query "show tables;" or "SELECT a.foo
> FROM pokes a;", the following exception is raised.
> What is wrong in my setup?
>
> My Hadoop cluster is running with 5 nodes (hadoop 0.20.0 + hbase
> 0.20.0rc1). I'd like to use this cluster as storage for hive. Hive
> package is installed on one of the nodes. Hadoop configuration in hive
> is the same with hadoop cluster.
>
>
> ------ log -----
> hive> show tables;
> 09/09/07 16:47:23 INFO parse.ParseDriver: Parsing command: show tables
> 09/09/07 16:47:23 INFO parse.ParseDriver: Parse Completed
> 09/09/07 16:47:23 INFO ql.Driver: Semantic Analysis Completed
> 09/09/07 16:47:23 INFO ql.Driver: Starting command: show tables
> 09/09/07 16:47:23 INFO metastore.HiveMetaStore: 0: Opening raw store
> with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> 09/09/07 16:47:23 INFO metastore.ObjectStore: ObjectStore, initialize called
> 09/09/07 16:47:23 ERROR DataNucleus.Plugin: Bundle
> "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it
> cannot be resolved.
> 09/09/07 16:47:23 ERROR DataNucleus.Plugin: Bundle
> "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it
> cannot be resolved.
> 09/09/07 16:47:23 ERROR DataNucleus.Plugin: Bundle
> "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be
> resolved.
> 09/09/07 16:47:24 INFO metastore.ObjectStore: Initialized ObjectStore
> 09/09/07 16:47:24 INFO metastore.HiveMetaStore: 0: get_tables: db=default 
> pat=.*
> OK
> 09/09/07 16:47:24 INFO ql.Driver: OK
> Failed with exception java.io.IOException:java.io.IOException: Cannot
> create an instance of InputFormat class
> org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
> 09/09/07 16:47:24 ERROR CliDriver: Failed with exception
> java.io.IOException:java.io.IOException: Cannot create an instance of
> InputFormat class org.apache.hadoop.mapred.TextInputFormat as
> specified in mapredWork!
> java.io.IOException: java.io.IOException: Cannot create an instance of
> InputFormat class org.apache.hadoop.mapred.TextInputFormat as
> specified in mapredWork!
>        at 
> org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:275)
>        at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:138)
>        at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:447)
>        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:131)
>        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:181)
>        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:287)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.io.IOException: Cannot create an instance of
> InputFormat class org.apache.hadoop.mapred.TextInputFormat as
> specified in mapredWork!
>        at 
> org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:113)
>        at 
> org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:214)
>        at 
> org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:251)
>        ... 10 more
>
> Time taken: 2.003 seconds
> 09/09/07 16:47:24 INFO CliDriver: Time taken: 2.003 seconds
> hive> select a.* from pokes a;
> 09/09/07 16:48:48 INFO parse.ParseDriver: Parsing command: select a.*
> from pokes a
> 09/09/07 16:48:48 INFO parse.ParseDriver: Parse Completed
> 09/09/07 16:48:48 INFO parse.SemanticAnalyzer: Starting Semantic Analysis
> 09/09/07 16:48:48 INFO parse.SemanticAnalyzer: Completed phase 1 of
> Semantic Analysis
> 09/09/07 16:48:48 INFO parse.SemanticAnalyzer: Get metadata for source tables
> 09/09/07 16:48:48 INFO metastore.HiveMetaStore: 0: get_table :
> db=default tbl=pokes
> 09/09/07 16:48:48 INFO metastore.HiveMetaStore: 0: Opening raw store
> with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> 09/09/07 16:48:48 INFO metastore.ObjectStore: ObjectStore, initialize called
> 09/09/07 16:48:48 INFO metastore.ObjectStore: Initialized ObjectStore
> 09/09/07 16:48:48 INFO hive.log: DDL: struct pokes { i32 foo, string bar}
> 09/09/07 16:48:49 INFO parse.SemanticAnalyzer: Get metadata for subqueries
> 09/09/07 16:48:49 INFO parse.SemanticAnalyzer: Get metadata for
> destination tables
> 09/09/07 16:48:49 INFO parse.SemanticAnalyzer: Completed getting
> MetaData in Semantic Analysis
> 09/09/07 16:48:49 INFO ppd.OpProcFactory: Processing for FS(2)
> 09/09/07 16:48:49 INFO ppd.OpProcFactory: Processing for SEL(1)
> 09/09/07 16:48:49 INFO ppd.OpProcFactory: Processing for TS(0)
> 09/09/07 16:48:49 INFO parse.SemanticAnalyzer: Completed partition pruning
> 09/09/07 16:48:49 INFO parse.SemanticAnalyzer: Completed sample pruning
> 09/09/07 16:48:49 INFO parse.SemanticAnalyzer: Completed plan generation
> 09/09/07 16:48:49 INFO ql.Driver: Semantic Analysis Completed
> 09/09/07 16:48:49 INFO ql.Driver: Starting command: select a.* from pokes a
> OK
> 09/09/07 16:48:49 INFO ql.Driver: OK
> Failed with exception java.io.IOException:java.io.IOException: Cannot
> create an instance of InputFormat class
> org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
> 09/09/07 16:48:49 ERROR CliDriver: Failed with exception
> java.io.IOException:java.io.IOException: Cannot create an instance of
> InputFormat class org.apache.hadoop.mapred.TextInputFormat as
> specified in mapredWork!
> java.io.IOException: java.io.IOException: Cannot create an instance of
> InputFormat class org.apache.hadoop.mapred.TextInputFormat as
> specified in mapredWork!
>        at 
> org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:275)
>        at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:138)
>        at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:447)
>        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:131)
>        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:181)
>        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:287)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.io.IOException: Cannot create an instance of
> InputFormat class org.apache.hadoop.mapred.TextInputFormat as
> specified in mapredWork!
>        at 
> org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:113)
>        at 
> org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:214)
>        at 
> org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:251)
>        ... 10 more
>
> Time taken: 0.36 seconds
> 09/09/07 16:48:49 INFO CliDriver: Time taken: 0.36 seconds
> ---- end of log -----
>
>
> Best regards,
> Shotaro Kamio
>



-- 
Shotaro Kamio