Hello Venkat,
Apologies for my previous reply. I used the --create-hcatalog-table argument.

Now I am getting another error:
16/08/17 05:53:25 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: HCat exited with status 64
        at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.executeExternalHCatProgram(SqoopHCatUtilities.java:1129)
        at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.launchHCatCli(SqoopHCatUtilities.java:1078)
        at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.createHCatTable(SqoopHCatUtilities.java:625)
        at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:340)
        at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureImportOutputFormat(SqoopHCatUtilities.java:783)
        at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:98)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:259)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
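In case it is useful for debugging, here is a sketch of the other approach Venkat suggested: pre-creating the partitioned table and then importing without --create-hcatalog-table. This is only a sketch under assumptions — the column list (id, name) is a placeholder and must be replaced with the real schema of the Postgres user1 table, and ORC is just one possible storage format:

```shell
# Pre-create the partitioned Hive table (placeholder columns --
# replace id/name with the actual columns of the Postgres user1 table).
hive -e "
CREATE TABLE IF NOT EXISTS test.user1 (
  id INT,
  name STRING
)
PARTITIONED BY (year STRING, month STRING, day STRING)
STORED AS ORC;"

# Then run the import WITHOUT --create-hcatalog-table, so Sqoop writes
# into the pre-created table. -P prompts for the password instead of
# passing it on the command line, as the BaseSqoopTool warning advises.
sqoop import --connect jdbc:postgresql://localhost:7432/test_db \
  --driver org.postgresql.Driver --username pgadmin -P \
  --table user1 \
  --hcatalog-database test \
  --hcatalog-table user1 \
  --hcatalog-partition-keys year,month,day \
  --hcatalog-partition-values '2016,08,15' \
  --verbose
```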


On Wed, Aug 17, 2016 at 8:49 AM, Mahebub Sayyed <[email protected]>
wrote:

> Hello Venkat,
> There is no argument "–create-hcatalog-table" in the import command.
> I got the following error:
>
> 16/08/17 05:46:14 WARN tool.BaseSqoopTool: Setting your password on the
> command-line is insecure. Consider using -P instead.
> 16/08/17 05:46:14 ERROR tool.BaseSqoopTool: Error parsing arguments for import:
> 16/08/17 05:46:14 ERROR tool.BaseSqoopTool: Unrecognized argument: –create-hcatalog-table
> 16/08/17 05:46:14 ERROR tool.BaseSqoopTool: Unrecognized argument: --hcatalog-table
> 16/08/17 05:46:14 ERROR tool.BaseSqoopTool: Unrecognized argument: user1
>
>
> On Tue, Aug 16, 2016 at 5:05 PM, Venkat Ranganathan <
> [email protected]> wrote:
>
>> You don’t need the fields- and lines-terminated options with the hcatalog
>> table option. The NoSuchObjectException error shows that the table does not
>> exist. Either use the --create-hcatalog-table option or pre-create the table.
>>
>>
>>
>> Venkat
>>
>>
>>
>> From: Mahebub Sayyed <[email protected]>
>> Reply-To: "[email protected]" <[email protected]>
>> Date: Tuesday, August 16, 2016 at 4:07 AM
>> To: "[email protected]" <[email protected]>
>> Subject: Re: how to create multi level partition in hive using sqoop
>>
>>
>>
>> Hello Boglarka and Markus,
>>
>>
>>
>> Thanks for reply.
>>
>> This is my Sqoop command:
>>
>> sqoop import --connect jdbc:postgresql://localhost:7432/test_db \
>>   --driver org.postgresql.Driver --username pgadmin --password pgadmin@1234 \
>>   --table user1 \
>>   --fields-terminated-by '\001' \
>>   --lines-terminated-by '\012' \
>>   --hcatalog-database test \
>>   --hcatalog-table user1 \
>>   --hcatalog-partition-keys year,month,day \
>>   --hcatalog-partition-values '2016,08,15' \
>>   --verbose
>>
>> But I am getting this error:
>>
>> ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: NoSuchObjectException(message:test.user1 table not found)
>>         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:97)
>>         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
>>         at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:343)
>>         at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureImportOutputFormat(SqoopHCatUtilities.java:783)
>>         at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:98)
>>         at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:259)
>>         at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
>>         at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
>>         at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
>>         at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
>>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
>>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
>>         at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
>> Caused by: NoSuchObjectException(message:test.user1 table not found)
>>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:34980)
>>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:34948)
>>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:34879)
>>         at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1214)
>>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1200)
>>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1201)
>>         at org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:180)
>>         at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:105)
>>         at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
>>         at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
>>         ... 14 more
>>
>> On Mon, Aug 15, 2016 at 4:35 PM, Markus Kemper <[email protected]>
>> wrote:
>>
>> Hello Mahebub,
>>
>>
>>
>> Bogi is correct, and a great answer by the way.
>>
>>
>>
>> To the best of my knowledge, with Sqoop and Hive Partitioning the
>> following rules apply:
>>
>> 1. With static partitions you can use either (--hive-import or --hcatalog
>> options)
>>
>> 2. With dynamic partitions you can only use (--hcatalog options)
>>
>>
>>
>> Example (static):
>>
>> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD
>> --table t1 --columns c1,c2 --where "p1 = 1" --num-mappers 1 --hive-import
>> --hive-database default --hive-table t1_partition --hive-partition-key
>> <col> --hive-partition-value <value>
>>
>>
>>
>> Example (dynamic):
>>
>> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD
>> --table t1 --hcatalog-database default --hcatalog-table t1_partitioned
>> --num-mappers 1 --verbose --where "c1 > 1" --hive-partition-value <col>
>>
>>
>>
>>
>> Markus Kemper
>> Customer Operations Engineer
>> www.cloudera.com <http://www.cloudera.com>
>>
>>
>>
>>
>>
>> On Mon, Aug 15, 2016 at 9:19 AM, Boglarka Egyed <[email protected]>
>> wrote:
>>
>> Hi Mahebub,
>>
>>
>>
>> Unfortunately, using *--hive-partition-key* and *--hive-partition-value*
>> restricts each Sqoop statement to importing into a single Hive partition;
>> there is currently no support for Hive auto-partitioning. Instead, if a
>> data set is to be imported into multiple partitions of a table, a separate
>> Sqoop statement is needed for each partition.
>>
>>
>>
>> However, using *--hcatalog-partition-keys *and
>> *--hcatalog-partition-values *you can specify multiple static partition
>> key/value pairs, please find the details in the User Guide:
>> https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_sqoop_hcatalog_integration
>>
>> Best Regards,
>> Bogi
>>
>>
>>
>> On Mon, Aug 15, 2016 at 9:51 AM, Mahebub Sayyed <[email protected]>
>> wrote:
>>
>> I need to create/import a Hive table with three partition levels,
>> year/month/day, using Sqoop. I have looked at *--hive-partition-key* and
>> *--hive-partition-value* in Sqoop; using these parameters I have created a
>> single partition, *year*, like this: --hive-partition-key year
>> --hive-partition-value '2016'. My question is how to pass multiple
>> partition keys and values to create partitions like year/month/day.
>>
>>
>>
>> --
>>
>> *Regards,*
>>
>> *Mahebub Sayyed*
>>
>> --
>>
>> *Regards,*
>>
>> *Mahebub Sayyed*
>>
>
>
>
> --
> *Regards,*
> *Mahebub Sayyed*
>



-- 
*Regards,*
*Mahebub Sayyed*
