On Mon, Apr 5, 2010 at 7:23 PM, Sagar Naik <[email protected]> wrote:
> Thanks for the responses,
> I am still not able to get it running.
> Comments inline. Just to make sure, I did all you suggested.
> Please advise.
> -Sagar
> On Apr 5, 2010, at 1:51 PM, Arvind Prabhakar wrote:
>
> Hi Sagar,
>> As a trial, I am trying to set up Hive in local DFS/MR mode
>
> You can do this as follows:
> 1. Set your HADOOP_HOME to the local hadoop installation. The configuration
> files - core-site.xml, mapred-site.xml, and hdfs-site.xml in
> $HADOOP_HOME/conf should be empty configurations with no properties
> specified.
>
> Using hadoop-0.18.1, so it is only hadoop-site.xml. It is without any
> properties.
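>
> For reference, a hadoop-site.xml with no properties is just an empty
> configuration element, something like this:
>
> <?xml version="1.0"?>
> <!-- Intentionally empty: with nothing overridden here, fs.default.name
>      and mapred.job.tracker keep the local-mode defaults from
>      hadoop-default.xml, so both DFS and map-reduce run locally. -->
> <configuration>
> </configuration>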
>
> 2. In your HIVE_HOME/conf directory - create a file called hive-site.xml and
> specify the javax.jdo.option.ConnectionURL
> as jdbc:derby:;databaseName=/path/to/your/metastore;create=true.
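>
> As a rough sketch (the database path below is only a placeholder), the
> whole hive-site.xml can be as small as:
>
> <?xml version="1.0"?>
> <configuration>
>   <!-- Embedded Derby metastore at an explicit location; everything
>        else is inherited from hive-default.xml. -->
>   <property>
>     <name>javax.jdo.option.ConnectionURL</name>
>     <value>jdbc:derby:;databaseName=/path/to/your/metastore;create=true</value>
>   </property>
> </configuration>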
>
> Done. It is also there in hive-default.xml.
>
> Now you should be able to create the metastore database locally wherever
> you want, and your DFS and map-reduce systems should run locally.
>
> It is my understanding that this should be the default setup anyway for both
> hadoop and hive - except for the custom connection URL for the metastore. If
> you were not to specify the custom connection URL, the metastore would be
> created in ${PWD}/metastore_db, which is also local/embedded.
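>
> (If memory serves, the corresponding entry shipped in hive-default.xml
> looks roughly like this, which is where the ${PWD}/metastore_db
> behaviour comes from:
>
> <property>
>   <name>javax.jdo.option.ConnectionURL</name>
>   <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
> </property>
>
> so overriding it in hive-site.xml only changes where the embedded Derby
> database lives.)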
>
> Arvind
> On Mon, Apr 5, 2010 at 11:55 AM, Sagar Naik <[email protected]> wrote:
>>
>> Hi
>> I tried to set it up in embedded mode, the easiest one :)
>> Still no luck
>>
>> <property>
>>  <name>mapred.reduce.tasks</name>
>>  <value>local</value>
>>    <description>The default number of reduce tasks per job.  Typically set
>>  to a prime close to the number of available hosts.  Ignored when
>>  mapred.job.tracker is "local". Hadoop set this to 1 by default, whereas
>> hive uses -1 as its default value.
>>  By setting this property to -1, Hive will automatically figure out what
>> should be the number of reducers.
>>  </description>
>> </property>
>>
>> <property>
>>  <name>fs.default.name</name>
>>  <value>namenode:54310</value>
>> </property>
>>
>> <property>
>>  <name>javax.jdo.option.ConnectionURL</name>
>>
>>  <value>jdbc:derby:;databaseName=/data/hive/hive_metastore_db;create=true</value>
>> </property>
>>
>>
>> <property>
>>  <name>javax.jdo.option.ConnectionDriverName</name>
>>  <value>org.apache.derby.jdbc.EmbeddedDriver</value>
>> </property>
>>
>>
>> <property>
>>  <name>hive.metastore.warehouse.dir</name>
>>  <value>file:///data/hive/warehouse</value>
>> </property>
>>
>>
>> <property>
>>  <name>hive.metastore.local</name>
>>  <value>true</value>
>> </property>
>>
>> I made sure that hive-site.xml is on the classpath.
>>
>>
>> bin/hive
>> hive-log4j.properties not found
>> Hive history file=/tmp/argus/hive_job_log_argus_201004051154_330230103.txt
>> 10/04/05 11:54:06 [main] INFO exec.HiveHistory: Hive history
>> file=/tmp/argus/hive_job_log_argus_201004051154_330230103.txt
>> hive> CREATE TABLE pokes (foo INT, bar STRING);
>> 10/04/05 11:54:10 [main] INFO parse.ParseDriver: Parsing command: CREATE
>> TABLE pokes (foo INT, bar STRING)
>> 10/04/05 11:54:10 [main] INFO parse.ParseDriver: Parse Completed
>> 10/04/05 11:54:10 [main] INFO parse.SemanticAnalyzer: Starting Semantic
>> Analysis
>> 10/04/05 11:54:10 [main] INFO parse.SemanticAnalyzer: Creating tablepokes
>> positin=13
>> 10/04/05 11:54:10 [main] INFO ql.Driver: Semantic Analysis Completed
>> 10/04/05 11:54:10 [main] INFO ql.Driver: Starting command: CREATE TABLE
>> pokes (foo INT, bar STRING)
>> 10/04/05 11:54:10 [main] INFO exec.DDLTask: Default to LazySimpleSerDe for
>> table pokes
>> 10/04/05 11:54:10 [main] INFO hive.log: DDL: struct pokes { i32 foo,
>> string bar}
>> FAILED: Error in metadata: java.lang.IllegalArgumentException: URI:  does
>> not have a scheme
>> 10/04/05 11:54:11 [main] ERROR exec.DDLTask: FAILED: Error in metadata:
>> java.lang.IllegalArgumentException: URI:  does not have a scheme
>> org.apache.hadoop.hive.ql.metadata.HiveException:
>> java.lang.IllegalArgumentException: URI:  does not have a scheme
>>        at
>> org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:281)
>>        at
>> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:1281)
>>        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:119)
>>        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:99)
>>        at
>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:64)
>>        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:582)
>>        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:462)
>>        at org.apache.hadoop.hive.ql.Driver.runCommand(Driver.java:324)
>>        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:312)
>>        at
>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:123)
>>        at
>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:181)
>>        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:287)
>>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>        at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>        at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>        at java.lang.reflect.Method.invoke(Method.java:597)
>>        at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>        at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>        at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>> Caused by: java.lang.IllegalArgumentException: URI:  does not have a
>> scheme
>>        at
>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:92)
>>        at
>> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:828)
>>        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:838)
>>        at
>> org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:275)
>>        ... 20 more
>>
>> FAILED: Execution Error, return code 1 from
>> org.apache.hadoop.hive.ql.exec.DDLTask
>> 10/04/05 11:54:11 [main] ERROR ql.Driver: FAILED: Execution Error, return
>> code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
>>
>>
>> On Apr 5, 2010, at 1:02 AM, Zheng Shao wrote:
>>
>> > See http://wiki.apache.org/hadoop/Hive/AdminManual/MetastoreAdmin for
>> > details.
>> >
>> > Zheng
>> >
>> > On Mon, Apr 5, 2010 at 12:01 AM, Sagar Naik <[email protected]>
>> > wrote:
>> >> Hi
>> >> As a trial, I am trying to set up Hive in local DFS/MR mode
>> >> I have set
>> >> <property>
>> >>  <name>hive.metastore.uris</name>
>> >>  <value>file:///data/hive/metastore/metadb</value>
>> >>  <description>The location of filestore metadata base dir</description>
>> >> </property>
>> >>
>> >> in hive-site.xml
>> >>
>> >> But I am still getting the following error:
>> >>
>> >> Please help me get Hive up and running.
>> >>
>> >>
>> >>
>> >>
>> >> CREATE TABLE pokes (foo INT, bar STRING);
>> >> 10/04/04 23:58:08 [main] INFO parse.ParseDriver: Parsing command:
>> >> CREATE TABLE pokes (foo INT, bar STRING)
>> >> 10/04/04 23:58:08 [main] INFO parse.ParseDriver: Parse Completed
>> >> 10/04/04 23:58:08 [main] INFO parse.SemanticAnalyzer: Starting Semantic
>> >> Analysis
>> >> 10/04/04 23:58:08 [main] INFO parse.SemanticAnalyzer: Creating
>> >> tablepokes positin=13
>> >> 10/04/04 23:58:08 [main] INFO ql.Driver: Semantic Analysis Completed
>> >> 10/04/04 23:58:08 [main] INFO ql.Driver: Starting command: CREATE TABLE
>> >> pokes (foo INT, bar STRING)
>> >> 10/04/04 23:58:08 [main] INFO exec.DDLTask: Default to LazySimpleSerDe
>> >> for table pokes
>> >> 10/04/04 23:58:08 [main] INFO hive.log: DDL: struct pokes { i32 foo,
>> >> string bar}
>> >> FAILED: Error in metadata: java.lang.IllegalArgumentException: URI:
>> >>  does not have a scheme
>> >> 10/04/04 23:58:08 [main] ERROR exec.DDLTask: FAILED: Error in metadata:
>> >> java.lang.IllegalArgumentException: URI:  does not have a scheme
>> >> org.apache.hadoop.hive.ql.metadata.HiveException:
>> >> java.lang.IllegalArgumentException: URI:  does not have a scheme
>> >>        at
>> >> org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:281)
>> >>        at
>> >> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:1281)
>> >>        at
>> >> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:119)
>> >>        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:99)
>> >>        at
>> >> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:64)
>> >>        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:582)
>> >>        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:462)
>> >>        at org.apache.hadoop.hive.ql.Driver.runCommand(Driver.java:324)
>> >>        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:312)
>> >>        at
>> >> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:123)
>> >>        at
>> >> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:181)
>> >>        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:287)
>> >>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >>        at
>> >> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >>        at
>> >> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >>        at java.lang.reflect.Method.invoke(Method.java:597)
>> >>        at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>> >>        at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>> >>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> >>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>> >>        at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>> >> Caused by: java.lang.IllegalArgumentException: URI:  does not have a
>> >> scheme
>> >>        at
>> >> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:92)
>> >>        at
>> >> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:828)
>> >>        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:838)
>> >>        at
>> >> org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:275)
>> >>        ... 20 more
>> >>
>> >> FAILED: Execution Error, return code 1 from
>> >> org.apache.hadoop.hive.ql.exec.DDLTask
>> >>
>> >>
>> >
>> >
>> >
>> > --
>> > Yours,
>> > Zheng
>>
>
>
>

Sorry for the late reply here, but I just ran into the same problem. I
will explain this problem in my context. I am writing an app that
needs Hive in a test case. I have added several of the Hive jars:
metastore and hive-exec.

This error message almost always indicates that hive-default.xml and
hive-site.xml are not on your classpath. In my Eclipse setup built by
Maven's meclipse, files other than *.java are excluded from the source
classpath; however, resources/*.xml files are picked up.

I had to put the xml files into test/resources.
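
For reference, the relevant bit of the build is roughly the following;
this only spells out Maven's default test-resource handling, so the
directory name is the conventional one rather than anything special:

<build>
  <testResources>
    <testResource>
      <!-- hive-default.xml and hive-site.xml live here; Maven copies
           everything under this directory onto the test classpath. -->
      <directory>src/test/resources</directory>
    </testResource>
  </testResources>
</build>
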
Good luck,
Edward
