PS: I am using the Hive 0.12 release and the Hadoop 1.2.1 release.

On Jun 17, 2014, at 5:07 PM, Fastupload <fastupl...@outlook.com> wrote:

> Hi,
> 
> My MapReduce code reads an RC file through HCatalog, with the Hive metastore 
> in a remote Oracle database. I wrote a demo following the wiki page, 
> https://cwiki.apache.org/confluence/display/Hive/HCatalog+InputOutput, and 
> packaged all dependency jars into one jar.
> 
> The job fails at runtime. Some lines of the failure stack are:
> Caused by: org.datanucleus.exceptions.NucleusUserException: Persistence 
> process has been specified to use a ClassLoaderResolver of name "datanucleus" 
> yet this has not been found by the DataNucleus plugin mechanism. Please check 
> your CLASSPATH and plugin specification.
> 
> It seems that the HCatInputFormat class cannot create a 
> JDOPersistenceManagerFactory for the HiveMetaStoreClient object in 
> InitializeInput.java, lines 101 to 106:
> if (conf != null) {
>   hiveConf = HCatUtil.getHiveConf(conf);
> } else {
>   hiveConf = new HiveConf(HCatInputFormat.class);
> }
> client = HCatUtil.getHiveClient(hiveConf);
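One thing worth checking: this NucleusUserException typically shows up when HiveMetaStoreClient falls back to an embedded (local) metastore and tries to bootstrap DataNucleus/JDO itself, instead of talking to a remote metastore service. As a hedged sketch (the host and port below are placeholders, not taken from the poster's setup), a hive-site.xml that forces the remote thrift path would look roughly like:

```xml
<configuration>
  <!-- Point clients at a running remote metastore service. With this set,
       HiveMetaStoreClient talks thrift to the server and never needs to
       instantiate DataNucleus/JDO classes on the client side. -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>
```

If this property is absent (or not visible on the task classpath), the client tries to open the metastore database directly, which requires the full set of DataNucleus plugin jars to be resolvable.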
> 
> These lines create a HiveMetaStoreClient from either the job configuration 
> or the HCatInputFormat properties. So I added code to load hive-site.xml into 
> both the job configuration and the HCatInputFormat properties:
> 
> // load the hive metastore configuration file into both the
> // properties and the job config
> Properties prop = new Properties();
> FileInputStream confStream = new FileInputStream(args[4]);
> prop.load(confStream);
> conf.addResource(confStream);
> HCatInputFormat.setInput(job, dbName, tblName).setFilter(filter).setProperties(prop);
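One detail in the snippet above may be defeating the addResource call, independent of the DataNucleus problem: Properties.load() reads confStream to EOF, so conf.addResource(confStream) is handed an already-consumed stream. A small self-contained sketch (using an in-memory stream rather than the actual file) shows the effect:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.Properties;

public class StreamReuseDemo {
    public static void main(String[] args) throws IOException {
        byte[] data = "hive.metastore.uris=thrift://metastore-host:9083\n".getBytes("UTF-8");
        ByteArrayInputStream in = new ByteArrayInputStream(data);

        Properties first = new Properties();
        first.load(in);                     // consumes the stream to EOF

        Properties second = new Properties();
        second.load(in);                    // stream is exhausted; nothing is read

        System.out.println(first.size());   // 1
        System.out.println(second.size());  // 0
    }
}
```

Opening the file a second time, or using conf.addResource(new org.apache.hadoop.fs.Path(args[4])), would avoid reusing the consumed stream. Note also that Properties.load expects key=value lines while addResource expects Hadoop-style XML, so the same file cannot satisfy both parsers.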
> 
> But the job still gets the same error. Any ideas? 
> For more of the error log and the code, please see the two attachments.
> <error log.txt>
> <code fragment.txt>
> 
> Best Regards,
> Link Qian
