No, I did not change anything.
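(For what it's worth, since everything is still at the defaults on my side, the metastore settings you mention should just be the javax.jdo.option.* properties that ship in conf/hive-default.xml. A quick way I can dump them to double-check, assuming the stock conf/ layout under the Hive install directory and no hive-site.xml override, both of which are assumptions on my part:

    # show the JDO/metastore properties the release ships with; hive-site.xml
    # may not exist on a fresh install, hence the 2>/dev/null
    grep -A 1 'javax.jdo' conf/hive-default.xml conf/hive-site.xml 2>/dev/null

)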
And by the way, I synced Hive from svn but cannot build it; the following is
the error message:
[ivy:retrieve] :: resolution report :: resolve 7120ms :: artifacts dl 454644ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   4   |   0   |   0   |   0   ||   4   |   0   |
        ---------------------------------------------------------------------
[ivy:retrieve]
[ivy:retrieve] :: problems summary ::
[ivy:retrieve] :::: WARNINGS
[ivy:retrieve] [FAILED ] hadoop#core;0.18.3!hadoop.tar.gz(source):
Downloaded file size doesn't match expected Content Length for
http://archive.apache.org/dist/hadoop/core/hadoop-0.18.3/hadoop-0.18.3.tar.gz.
Please retry. (154498ms)
[ivy:retrieve] [FAILED ] hadoop#core;0.18.3!hadoop.tar.gz(source): (0ms)
[ivy:retrieve] ==== hadoop-source: tried
[ivy:retrieve]   http://archive.apache.org/dist/hadoop/core/hadoop-0.18.3/hadoop-0.18.3.tar.gz
[ivy:retrieve] ==== apache-snapshot: tried
[ivy:retrieve]   https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.18.3/hadoop-0.18.3.tar.gz
[ivy:retrieve] ==== maven2: tried
[ivy:retrieve]   http://repo1.maven.org/maven2/hadoop/core/0.18.3/core-0.18.3.tar.gz
[ivy:retrieve] [FAILED ] hadoop#core;0.19.0!hadoop.tar.gz(source):
Downloaded file size doesn't match expected Content Length for
http://archive.apache.org/dist/hadoop/core/hadoop-0.19.0/hadoop-0.19.0.tar.gz.
Please retry. (153130ms)
[ivy:retrieve] [FAILED ] hadoop#core;0.19.0!hadoop.tar.gz(source): (0ms)
[ivy:retrieve] ==== hadoop-source: tried
[ivy:retrieve]   http://archive.apache.org/dist/hadoop/core/hadoop-0.19.0/hadoop-0.19.0.tar.gz
[ivy:retrieve] ==== apache-snapshot: tried
[ivy:retrieve]   https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.19.0/hadoop-0.19.0.tar.gz
[ivy:retrieve] ==== maven2: tried
[ivy:retrieve]   http://repo1.maven.org/maven2/hadoop/core/0.19.0/core-0.19.0.tar.gz
[ivy:retrieve] [FAILED ] hadoop#core;0.20.0!hadoop.tar.gz(source):
Downloaded file size doesn't match expected Content Length for
http://archive.apache.org/dist/hadoop/core/hadoop-0.20.0/hadoop-0.20.0.tar.gz.
Please retry. (147000ms)
[ivy:retrieve] [FAILED ] hadoop#core;0.20.0!hadoop.tar.gz(source): (0ms)
[ivy:retrieve] ==== hadoop-source: tried
[ivy:retrieve]   http://archive.apache.org/dist/hadoop/core/hadoop-0.20.0/hadoop-0.20.0.tar.gz
[ivy:retrieve] ==== apache-snapshot: tried
[ivy:retrieve]   https://repository.apache.org/content/repositories/snapshots/hadoop/core/0.20.0/hadoop-0.20.0.tar.gz
[ivy:retrieve] ==== maven2: tried
[ivy:retrieve]   http://repo1.maven.org/maven2/hadoop/core/0.20.0/core-0.20.0.tar.gz
[ivy:retrieve] ::::::::::::::::::::::::::::::::::::::::::::::
[ivy:retrieve] :: FAILED DOWNLOADS ::
[ivy:retrieve] :: ^ see resolution messages for details ^ ::
[ivy:retrieve] ::::::::::::::::::::::::::::::::::::::::::::::
[ivy:retrieve] :: hadoop#core;0.18.3!hadoop.tar.gz(source)
[ivy:retrieve] :: hadoop#core;0.19.0!hadoop.tar.gz(source)
[ivy:retrieve] :: hadoop#core;0.20.0!hadoop.tar.gz(source)
[ivy:retrieve] ::::::::::::::::::::::::::::::::::::::::::::::
[ivy:retrieve]
[ivy:retrieve] :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
BUILD FAILED
/root/Hive_trunk/build.xml:148: The following error occurred while executing
this line:
/root/Hive_trunk/build.xml:93: The following error occurred while executing
this line:
/root/Hive_trunk/shims/build.xml:55: The following error occurred while
executing this line:
/root/Hive_trunk/build-common.xml:173: impossible to resolve dependencies:
resolve failed - see output for details
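All three failures are size mismatches against archive.apache.org, and the message itself says "Please retry", so this looks more like truncated downloads than a problem in the build files. Before re-running ant I will probably just fetch the same URLs by hand (copied from the log above) to see whether complete tarballs come down:

    # re-fetch the tarballs Ivy failed on and check the sizes on disk
    wget http://archive.apache.org/dist/hadoop/core/hadoop-0.18.3/hadoop-0.18.3.tar.gz
    wget http://archive.apache.org/dist/hadoop/core/hadoop-0.19.0/hadoop-0.19.0.tar.gz
    wget http://archive.apache.org/dist/hadoop/core/hadoop-0.20.0/hadoop-0.20.0.tar.gz
    ls -l hadoop-*.tar.gz

This is only a sanity check on my side; I don't know offhand where the Hive build expects Ivy to cache these, so it just confirms whether the mirror is serving complete files.
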
On Tue, Jan 26, 2010 at 6:04 PM, Zheng Shao <[email protected]> wrote:
> This usually happens when there is a problem in the metastore
> configuration.
> Did you change any hive configurations?
>
>
> Zheng
>
> On Tue, Jan 26, 2010 at 1:41 AM, Jeff Zhang <[email protected]> wrote:
> > The following are the logs:
> >
> >
> > 2010-01-26 17:23:51,509 ERROR exec.DDLTask (SessionState.java:printError(279)) - FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
> > NestedThrowables:
> > java.lang.reflect.InvocationTargetException
> > org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
> > NestedThrowables:
> > java.lang.reflect.InvocationTargetException
> >     at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:400)
> >     at org.apache.hadoop.hive.ql.metadata.Hive.getAllTables(Hive.java:387)
> >     at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:352)
> >     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:143)
> >     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:379)
> >     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:285)
> >     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:123)
> >     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:181)
> >     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:287)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >     at java.lang.reflect.Method.invoke(Method.java:597)
> >     at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> > Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
> > NestedThrowables:
> > java.lang.reflect.InvocationTargetException
> >     at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1186)
> >     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:803)
> >     at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:698)
> >     at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:161)
> >     at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:178)
> >     at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:122)
> >     at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:101)
> >     at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
> >     at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
> >     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:130)
> >     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:146)
> >     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:118)
> >     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:100)
> >     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:74)
> >     at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:783)
> >     at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:794)
> >     at org.apache.hadoop.hive.ql.metadata.Hive.getTablesByPattern(Hive.java:398)
> >     ... 13 more
> > Caused by: java.lang.reflect.InvocationTargetException
> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >     at java.lang.reflect.Method.invoke(Method.java:597)
> >     at javax.jdo.JDOHelper$16.run(JDOHelper.java:1956)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at javax.jdo.JDOHelper.invoke(JDOHelper.java:1951)
> >     at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1159)
> >     ... 29 more
> >
> >
> > On Tue, Jan 26, 2010 at 4:52 PM, Zheng Shao <[email protected]> wrote:
> >>
> >> Can you post the traces in /tmp/<user>/hive.log?
> >>
> >> Zheng
> >>
> >> On Tue, Jan 26, 2010 at 12:40 AM, Jeff Zhang <[email protected]> wrote:
> >> > Hi all,
> >> >
> >> > I followed the Getting Started wiki page, but I am using the Hive 0.4.1
> >> > release version rather than svn trunk. When I invoke the command "show
> >> > tables;", it shows the following error message. Has anyone run into this
> >> > problem before?
> >> >
> >> > hive> show tables;
> >> > FAILED: Error in metadata: javax.jdo.JDOFatalInternalException:
> >> > Unexpected
> >> > exception caught.
> >> > NestedThrowables:
> >> > java.lang.reflect.InvocationTargetException
> >> > FAILED: Execution Error, return code 1 from
> >> > org.apache.hadoop.hive.ql.exec.DDLTask
> >> >
> >> >
> >> > --
> >> > Best Regards
> >> >
> >> > Jeff Zhang
> >> >
> >>
> >>
> >>
> >> --
> >> Yours,
> >> Zheng
> >
> >
> >
> > --
> > Best Regards
> >
> > Jeff Zhang
> >
>
>
>
> --
> Yours,
> Zheng
>
--
Best Regards
Jeff Zhang