I remember that someone already posted a similar issue.

I ran into an issue when trying to import data from an RDBMS directly into
Hive, so I decided to bring the data into HDFS first (with Sqoop and the
textfile import option), then use a LOAD DATA statement in Hive.
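For reference, that two-step workaround looks roughly like this (the host, paths, and table names below are placeholders, not taken from the original job):

```shell
# Step 1: Sqoop import from the RDBMS into HDFS as plain text files
sqoop import \
  --connect jdbc:mysql://dbhost/test \
  --username sqoop_user \
  --password-file /user/me/db_password \
  --table simple \
  --as-textfile \
  --target-dir /user/me/staging/simple

# Step 2: load the staged files into an existing Hive table
hive -e "LOAD DATA INPATH '/user/me/staging/simple' INTO TABLE simple;"
```

Note that LOAD DATA INPATH moves the files out of the staging directory into the table's warehouse location rather than copying them.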

--
Laurent HATIER - Consultant Big Data & Business Intelligence at CapGemini
fr.linkedin.com/pub/laurent-hatier/25/36b/a86/
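One note on the metastore question in the thread below: the Sqoop/HCatalog code only reaches the real Hive metastore if a hive-site.xml that sets hive.metastore.uris is on the action's classpath. The "Using direct SQL, underlying DB is DERBY" line in the log suggests it fell back to a local embedded Derby metastore instead, which of course contains no default.simple table. A minimal sketch of shipping the file with the action (the file location and metastore host are assumptions, not taken from the original workflow):

```xml
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
    <!-- job-tracker, name-node, configuration as in the workflow below -->
    <command>import ...</command>
    <!-- copy hive-site.xml into the workflow app directory on HDFS,
         then ship it with the action so HCatalog can find the metastore -->
    <file>hive-site.xml#hive-site.xml</file>
</sqoop>
```

where hive-site.xml defines at least hive.metastore.uris, e.g. thrift://metastore-host:9083.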

2015-07-21 11:47 GMT+02:00 Venkat Ramachandran <[email protected]>:

> Makes sense. Let me try
>
> On Tuesday, July 21, 2015, Shwetha Shivalingamurthy <
> [email protected]> wrote:
>
> > Hive-site.xml should be in the class path like
> >
> > https://github.com/apache/oozie/blob/master/examples/src/main/apps/hcatalog/workflow.xml
> >
> > Alternatively, you can probably add it to hive sharelib, I am not sure
> >
> > -Shwetha
> >
> > On 21/07/15 1:57 pm, "Venkatesan Ramachandran" <[email protected]>
> > wrote:
> >
> > >Thanks Shwetha. Adding hcatalog,hive,hive2 to
> > >oozie.action.sharelib.for.sqoop
> > >in job configuration makes it move beyond the class not found exception.
> > >
> > >However, now Sqoop throws *java.io.IOException:
> > >NoSuchObjectException(message:default.simple table not found)* even
> > >though I have manually created the table in default.simple.
> > >
> > >How does the Sqoop action (or oozie) know how to talk to HCAT/metastore
> > >server? Is there some other config I'm missing?
> > >
> > >
> > >
> > >*Sqoop action workflow.xml*
> > ><workflow-app xmlns="uri:oozie:workflow:0.2" name="sqoop-wf">
> > >    <start to="sqoop-node"/>
> > >    <action name="sqoop-node">
> > >        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
> > >            <job-tracker>${jobTracker}</job-tracker>
> > >            <name-node>${nameNode}</name-node>
> > >            <configuration>
> > >                <property>
> > >                    <name>mapred.job.queue.name</name>
> > >                    <value>${queueName}</value>
> > >                </property>
> > >            </configuration>
> > >        <command>
> > >        import --connect jdbc:mysql://c6402/test --table simple
> > >        --username sqoop_user --password-file /user/ambari-qa/datastore/testdb_password
> > >        --num-mappers 2 --split-by id
> > >        --hcatalog-database default --hcatalog-table simple
> > >          </command>
> > >        </sqoop>
> > >        <ok to="end"/>
> > >        <error to="fail"/>
> > >    </action>
> > >
> > >    <kill name="fail">
> > >        <message>Sqoop failed, error
> > >message[${wf:errorMessage(wf:lastErrorNode())}]</message>
> > >    </kill>
> > >    <end name="end"/>
> > ></workflow-app>
> > >
> > >*Exception*:
> > >
> > >Sqoop command arguments :
> > >             import
> > >             --connect
> > >             jdbc:mysql://c6402/test
> > >             --table
> > >             simple
> > >             --username
> > >             sqoop_user
> > >             --password-file
> > >             /user/ambari-qa/datastore/testdb_password
> > >             --num-mappers
> > >             2
> > >             --split-by
> > >             id
> > >             --hcatalog-database
> > >             default
> > >             --hcatalog-table
> > >             simple
> > >
> > >
> > >
> > >7511 [main] INFO  org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities
> > >- Database column names projected : [id, name, value, modified_ts]
> > >2015-07-21 08:18:43,718 INFO  [main] hcat.SqoopHCatUtilities
> > >(SqoopHCatUtilities.java:initDBColumnInfo(519)) - Database column
> > >names projected : [id, name, value, modified_ts]
> > >7511 [main] INFO  org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities
> > >- Database column name - info map :
> > >       modified_ts : [Type : 93,Precision : 19,Scale : 0]
> > >       name : [Type : 12,Precision : 20,Scale : 0]
> > >       id : [Type : 4,Precision : 11,Scale : 0]
> > >       value : [Type : 4,Precision : 11,Scale : 0]
> > >
> > >2015-07-21 08:18:43,718 INFO  [main] hcat.SqoopHCatUtilities
> > >(SqoopHCatUtilities.java:initDBColumnInfo(530)) - Database column name
> > >- info map :
> > >       modified_ts : [Type : 93,Precision : 19,Scale : 0]
> > >       name : [Type : 12,Precision : 20,Scale : 0]
> > >       id : [Type : 4,Precision : 11,Scale : 0]
> > >       value : [Type : 4,Precision : 11,Scale : 0]
> > >
> > >2015-07-21 08:18:44,244 INFO  [main] metastore.HiveMetaStore
> > >(HiveMetaStore.java:newRawStore(589)) - 0: Opening raw store with
> > >implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> > >2015-07-21 08:18:44,286 INFO  [main] metastore.ObjectStore
> > >(ObjectStore.java:initialize(289)) - ObjectStore, initialize called
> > >2015-07-21 08:18:44,682 INFO  [main] DataNucleus.Persistence
> > >(Log4JLogger.java:info(77)) - Property
> > >hive.metastore.integral.jdo.pushdown unknown - will be ignored
> > >2015-07-21 08:18:44,682 INFO  [main] DataNucleus.Persistence
> > >(Log4JLogger.java:info(77)) - Property datanucleus.cache.level2
> > >unknown - will be ignored
> > >2015-07-21 08:18:49,025 INFO  [main] metastore.ObjectStore
> > >(ObjectStore.java:getPMF(370)) - Setting MetaStore object pin classes
> > >with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
> > >2015-07-21 08:18:51,339 INFO  [main] DataNucleus.Datastore
> > >(Log4JLogger.java:info(77)) - The class
> > >"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
> > >"embedded-only" so does not have its own datastore table.
> > >2015-07-21 08:18:51,341 INFO  [main] DataNucleus.Datastore
> > >(Log4JLogger.java:info(77)) - The class
> > >"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
> > >"embedded-only" so does not have its own datastore table.
> > >2015-07-21 08:18:53,638 INFO  [main] DataNucleus.Datastore
> > >(Log4JLogger.java:info(77)) - The class
> > >"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
> > >"embedded-only" so does not have its own datastore table.
> > >2015-07-21 08:18:53,638 INFO  [main] DataNucleus.Datastore
> > >(Log4JLogger.java:info(77)) - The class
> > >"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
> > >"embedded-only" so does not have its own datastore table.
> > >2015-07-21 08:18:54,116 INFO  [main] metastore.MetaStoreDirectSql
> > >(MetaStoreDirectSql.java:<init>(139)) - Using direct SQL, underlying
> > >DB is DERBY
> > >2015-07-21 08:18:54,120 INFO  [main] metastore.ObjectStore
> > >(ObjectStore.java:setConf(272)) - Initialized ObjectStore
> > >2015-07-21 08:18:54,303 WARN  [main] metastore.ObjectStore
> > >(ObjectStore.java:checkSchema(6658)) - Version information not found
> > >in metastore. hive.metastore.schema.verification is not enabled so
> > >recording the schema version 1.2.0
> > >2015-07-21 08:18:54,508 WARN  [main] metastore.ObjectStore
> > >(ObjectStore.java:getDatabase(568)) - Failed to get database default,
> > >returning NoSuchObjectException
> > >2015-07-21 08:18:54,694 INFO  [main] metastore.HiveMetaStore
> > >(HiveMetaStore.java:createDefaultRoles_core(663)) - Added admin role
> > >in metastore
> > >2015-07-21 08:18:54,701 INFO  [main] metastore.HiveMetaStore
> > >(HiveMetaStore.java:createDefaultRoles_core(672)) - Added public role
> > >in metastore
> > >2015-07-21 08:18:54,813 INFO  [main] metastore.HiveMetaStore
> > >(HiveMetaStore.java:addAdminUsers_core(712)) - No user is added in
> > >admin role, since config is empty
> > >2015-07-21 08:18:54,980 INFO  [main] metastore.HiveMetaStore
> > >(HiveMetaStore.java:logInfo(746)) - 0: get_databases:
> > >NonExistentDatabaseUsedForHealthCheck
> > >2015-07-21 08:18:54,981 INFO  [main] HiveMetaStore.audit
> > >(HiveMetaStore.java:logAuditEvent(371)) -
> > >ugi=ambari-qa  ip=unknown-ip-addr      cmd=get_databases:
> > >NonExistentDatabaseUsedForHealthCheck
> > >2015-07-21 08:18:55,019 INFO  [main] metastore.HiveMetaStore
> > >(HiveMetaStore.java:logInfo(746)) - 0: get_table : db=default
> > >tbl=simple
> > >2015-07-21 08:18:55,022 INFO  [main] HiveMetaStore.audit
> > >(HiveMetaStore.java:logAuditEvent(371)) -
> > >ugi=ambari-qa  ip=unknown-ip-addr      cmd=get_table : db=default tbl=simple
> > >18847 [main] ERROR org.apache.sqoop.tool.ImportTool  - Encountered
> > >IOException running import job: java.io.IOException:
> > >NoSuchObjectException(message:default.simple table not found)
> > >       at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:97)
> > >       at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
> > >       at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:343)
> > >       at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureImportOutputFormat(SqoopHCatUtilities.java:783)
> > >       at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:98)
> > >       at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:259)
> > >       at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
> > >       at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
> > >       at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
> > >       at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
> > >       at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
> > >       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> > >       at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
> > >       at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
> > >       at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
> > >       at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
> > >       at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
> > >       at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:177)
> > >       at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
> > >       at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
> > >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >       at java.lang.reflect.Method.invoke(Method.java:497)
> > >       at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
> > >       at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> > >       at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> > >Caused by: NoSuchObjectException(message:default.simple table not found)
> > >       at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(HiveMetaStore.java:1808)
> > >       at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1778)
> > >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >       at java.lang.reflect.Method.invoke(Method.java:497)
> > >       at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
> > >       at com.sun.proxy.$Proxy20.get_table(Unknown Source)
> > >       at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1208)
> > >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >       at java.lang.reflect.Method.invoke(Method.java:497)
> > >       at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:152)
> > >       at com.sun.proxy.$Proxy21.getTable(Unknown Source)
> > >       at org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:180)
> > >       at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:105)
> > >       at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
> > >       at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
> > >       ... 32 more
> >
> >
>
