Repository: sqoop
Updated Branches:
  refs/heads/trunk 40f0b74c0 -> c329f360d


SQOOP-3384: Document import into external Hive table backed by S3

(Boglarka Egyed via Szabolcs Vasas)


Project: http://git-wip-us.apache.org/repos/asf/sqoop/repo
Commit: http://git-wip-us.apache.org/repos/asf/sqoop/commit/c329f360
Tree: http://git-wip-us.apache.org/repos/asf/sqoop/tree/c329f360
Diff: http://git-wip-us.apache.org/repos/asf/sqoop/diff/c329f360

Branch: refs/heads/trunk
Commit: c329f360dd08ef3b9bd82897fcd611e7431d32c8
Parents: 40f0b74
Author: Szabolcs Vasas <[email protected]>
Authored: Mon Oct 15 15:32:39 2018 +0200
Committer: Szabolcs Vasas <[email protected]>
Committed: Mon Oct 15 15:32:39 2018 +0200

----------------------------------------------------------------------
 src/docs/user/s3.txt | 44 ++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 44 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/sqoop/blob/c329f360/src/docs/user/s3.txt
----------------------------------------------------------------------
diff --git a/src/docs/user/s3.txt b/src/docs/user/s3.txt
index 3724454..c54b26b 100644
--- a/src/docs/user/s3.txt
+++ b/src/docs/user/s3.txt
@@ -118,3 +118,47 @@ $ sqoop import \
 ----
 
 Data from RDBMS can be imported into S3 in incremental +lastmodified+ mode in Parquet file format too.
+
+Import Into External Hive Table Backed By S3
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+To import data from RDBMS into an external Hive table backed by S3, the AWS credentials have to be set in the Hive
+configuration file (+hive-site.xml+) too. To learn more about Hive on Amazon Web Services, please see the Hive
+documentation at https://cwiki.apache.org/confluence/display/Hive/HiveAws.
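+
+For example, the S3A credentials could be set in +hive-site.xml+ along these lines (a minimal sketch using the
+Hadoop S3A connector property names; the values are placeholders):
+
+----
+<property>
+  <name>fs.s3a.access.key</name>
+  <value>YOUR-ACCESS-KEY</value>
+</property>
+<property>
+  <name>fs.s3a.secret.key</name>
+  <value>YOUR-SECRET-KEY</value>
+</property>
+----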
+
+The current implementation of Sqoop requires that both the +target-dir+ and +external-table-dir+ options are set,
+where +external-table-dir+ has to point to the Hive table location in the S3 bucket.
+
+For example, to import into an external Hive table backed by S3:
+
+----
+$ sqoop import \
+  -Dfs.s3a.access.key=$AWS_ACCESS_KEY \
+  -Dfs.s3a.secret.key=$AWS_SECRET_KEY \
+  --connect $CONN \
+  --username $USER \
+  --password $PWD \
+  --table $TABLE_NAME \
+  --hive-import \
+  --target-dir s3a://example-bucket/target-directory \
+  --external-table-dir s3a://example-bucket/external-directory
+----
+
+For example, to create a new external Hive table backed by S3 during the import:
+
+----
+$ sqoop import \
+  -Dfs.s3a.access.key=$AWS_ACCESS_KEY \
+  -Dfs.s3a.secret.key=$AWS_SECRET_KEY \
+  --connect $CONN \
+  --username $USER \
+  --password $PWD \
+  --table $TABLE_NAME \
+  --hive-import \
+  --create-hive-table \
+  --hive-table $HIVE_TABLE_NAME \
+  --target-dir s3a://example-bucket/target-directory \
+  --external-table-dir s3a://example-bucket/external-directory
+----
+
+Data from RDBMS can be imported into an external Hive table backed by S3 in Parquet file format too.
\ No newline at end of file
