[ https://issues.apache.org/jira/browse/SQOOP-1393?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14133065#comment-14133065 ]

Richard commented on SQOOP-1393:
--------------------------------

There is no need to use {{--hcatalog-storage-stanza "STORED AS PARQUET"
--create-hcatalog-table --hcatalog-table abc2}}.
It has been tested successfully on Hive 0.13.0. For Hive 0.12.0, I will test it
later and let you know the result.
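
For reference, a minimal sketch of the plain Hive import this comment is pointing at, without the HCatalog options. The JDBC URL, credentials, and table names are placeholders, and the {{--as-parquetfile}} option is assumed here as the switch the attached patches wire into the Hive import path:

    # Placeholder connection string, credentials, and table names.
    # --as-parquetfile is assumed to request Parquet storage for the
    # Hive table instead of the default text format.
    sqoop import \
      --connect jdbc:mysql://db.example.com/sales \
      --username someuser --password somepass \
      --table abc2 \
      --hive-import \
      --hive-table abc2 \
      --as-parquetfile

With this form, Sqoop itself creates the Hive table with Parquet storage, so no separate {{--create-hcatalog-table}} invocation is needed.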

> Import data from database to Hive as Parquet files
> --------------------------------------------------
>
>                 Key: SQOOP-1393
>                 URL: https://issues.apache.org/jira/browse/SQOOP-1393
>             Project: Sqoop
>          Issue Type: Sub-task
>          Components: tools
>            Reporter: Qian Xu
>            Assignee: Richard
>             Fix For: 1.4.6
>
>         Attachments: patch.diff, patch_v2.diff, patch_v3.diff
>
>
> Importing data into Hive as Parquet files can be separated into two steps:
> 1. Import an individual table from an RDBMS to HDFS as a set of Parquet files.
> 2. Import the data into Hive by generating and executing a CREATE TABLE
> statement that defines the data's layout in Hive as a Parquet-format table
> (see the sketch below).
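
Roughly, the statement generated in step 2 would resemble the following; the column names and types here are illustrative only, since the real schema is derived from the source RDBMS table. Note that the {{STORED AS PARQUET}} shorthand is natively supported from Hive 0.13.0, which is relevant to the Hive 0.12.0 vs. 0.13.0 testing mentioned above:

    -- Illustrative schema; actual columns come from the imported table
    CREATE TABLE abc2 (
      id   INT,
      name STRING
    )
    STORED AS PARQUET;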



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
