[ https://issues.apache.org/jira/browse/HADOOP-5887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12712526#action_12712526 ]
Hadoop QA commented on HADOOP-5887:
-----------------------------------
-1 overall. Here are the results of testing the latest attachment
http://issues.apache.org/jira/secure/attachment/12408740/HADOOP-5887.patch
against trunk revision 777761.
+1 @author. The patch does not contain any @author tags.
+1 tests included. The patch appears to include 19 new or modified tests.
-1 patch. The patch command could not apply the patch.
Console output:
http://hudson.zones.apache.org/hudson/job/Hadoop-Patch-vesta.apache.org/391/console
This message is automatically generated.
> Sqoop should create tables in Hive metastore after importing to HDFS
> --------------------------------------------------------------------
>
> Key: HADOOP-5887
> URL: https://issues.apache.org/jira/browse/HADOOP-5887
> Project: Hadoop Core
> Issue Type: New Feature
> Reporter: Aaron Kimball
> Assignee: Aaron Kimball
> Attachments: HADOOP-5887.patch
>
>
> Sqoop (HADOOP-5815) imports tables into HDFS; it is a straightforward
> enhancement to then generate Hive DDL that recreates the table definition in
> the Hive metastore and moves the imported data from its upload target into
> the Hive warehouse directory.
> This enhancement makes that process automatic. An import is performed with
> sqoop in the usual way; supplying the argument "--hive-import" then causes it
> to issue a CREATE TABLE .. LOAD DATA INTO statement to a Hive shell (a rough
> sketch of this flow follows below). Sqoop generates a script file and
> attempts to run "$HIVE_HOME/bin/hive" on it, or failing that, any "hive" on
> the $PATH; $HIVE_HOME can be overridden with --hive-home. As a result, no
> direct linking against Hive is necessary.
> The unit tests provided with this enhancement use a mock implementation of
> 'bin/hive' that compares the script it is fed against one from a directory of
> "expected" scripts; which expected script is used is controlled via an
> environment variable (a sketch of this comparison idea also follows below).
> The mock does not load anything into a real Hive metastore, but manual
> testing has shown that the end-to-end process works in practice, so the mock
> is a reasonable unit-testing tool.
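For illustration only, here is a minimal sketch of the flow described above:
write a Hive script containing CREATE TABLE and LOAD DATA statements, then run
it through "$HIVE_HOME/bin/hive" (falling back to "hive" on the $PATH). The
table name, column list, field delimiter, and HDFS path below are hypothetical
and are not taken from the patch.

import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

public class HiveImportSketch {
  public static void main(String[] args) throws IOException, InterruptedException {
    // Hypothetical table layout and HDFS upload target; in Sqoop these would
    // come from the imported table's metadata and the import destination.
    String tableName = "employees";
    String hdfsDir = "/user/example/employees";

    // Generate the Hive DDL + load script.
    String script =
        "CREATE TABLE " + tableName + " (id INT, name STRING) "
      + "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\001' "
      + "LINES TERMINATED BY '\\n' STORED AS TEXTFILE;\n"
      + "LOAD DATA INPATH '" + hdfsDir + "' INTO TABLE " + tableName + ";\n";

    File scriptFile = File.createTempFile("hive-import-", ".q");
    FileWriter writer = new FileWriter(scriptFile);
    try {
      writer.write(script);
    } finally {
      writer.close();
    }

    // Prefer $HIVE_HOME/bin/hive; fall back to whatever "hive" is on the $PATH.
    String hiveHome = System.getenv("HIVE_HOME");
    String hiveBin = (hiveHome != null)
        ? new File(hiveHome, "bin/hive").getPath() : "hive";

    // "hive -f <script>" executes the statements in the script file.
    Process hive = new ProcessBuilder(hiveBin, "-f", scriptFile.getAbsolutePath())
        .inheritIO()
        .start();
    int status = hive.waitFor();
    System.out.println("hive exited with status " + status);
  }
}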
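And a rough sketch of the mock-comparison idea used by the unit tests, written
here as a small Java program rather than the shell wrapper the patch presumably
ships. The environment variable name (EXPECTED_SCRIPT) and the assumption that
the mock is invoked as "hive -f <script>" are illustrative, not details taken
from the patch.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class MockHiveCheck {
  public static void main(String[] args) throws IOException {
    // The mock stands in for 'bin/hive', which the import invokes as
    // "hive -f <generated script>"; take the script path from the last argument.
    String generatedScript = args[args.length - 1];

    // EXPECTED_SCRIPT is a hypothetical environment variable naming the
    // "expected" script this particular test run should produce.
    String expectedScript = System.getenv("EXPECTED_SCRIPT");

    List<String> actual = Files.readAllLines(Paths.get(generatedScript));
    List<String> expected = Files.readAllLines(Paths.get(expectedScript));

    if (!actual.equals(expected)) {
      System.err.println("Generated Hive script does not match " + expectedScript);
      System.exit(1); // a non-zero exit makes the calling unit test fail
    }
  }
}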
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.