-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/33104/
-----------------------------------------------------------
(Updated May 6, 2015, 1:49 p.m.)
Review request for Sqoop.
Changes
-------
The new patch has two changes: (1) it ensures the table directory is cleaned up in
the setup stage for every test case, and (2) it makes the record verification stable.
Bugs: SQOOP-2295
https://issues.apache.org/jira/browse/SQOOP-2295
Repository: sqoop-trunk
Description
-------
Currently, importing into an existing dataset with `--as-parquetfile` throws an
exception, which differs from the behavior of `--as-textfile`. I have checked the
user manual, and the handling of HDFS and Hive targets is indeed different. For
HDFS, the job fails when the destination already exists unless `--append` is
specified. For Hive, the job runs in append mode unless `--create-hive-table` is
specified. This patch makes the handling of `--as-textfile` and `--as-parquetfile`
consistent.
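For illustration, a sketch of the invocations in question (connection string, table
name, and target directory are placeholders, not from this patch):

```shell
# HDFS import: with the patch, --as-parquetfile now mirrors --as-textfile.
# Without --append, either variant fails if the target directory already exists.
sqoop import --connect jdbc:mysql://db.example.com/corp --table EMPLOYEES \
    --target-dir /user/foo/employees --as-parquetfile

# With --append, new records are added to the existing dataset instead.
sqoop import --connect jdbc:mysql://db.example.com/corp --table EMPLOYEES \
    --target-dir /user/foo/employees --as-parquetfile --append

# Hive import: appends to an existing table by default; --create-hive-table
# makes the job fail if the table already exists.
sqoop import --connect jdbc:mysql://db.example.com/corp --table EMPLOYEES \
    --hive-import --as-parquetfile --create-hive-table
```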
Diffs (updated)
-----
src/docs/man/hive-args.txt 7d9e427
src/docs/man/sqoop-create-hive-table.txt 7aebcc1
src/docs/user/create-hive-table.txt 3aa34fd
src/docs/user/hive-args.txt 53de92d
src/java/org/apache/sqoop/mapreduce/DataDrivenImportJob.java d5bfae2
src/java/org/apache/sqoop/mapreduce/ParquetJob.java df55dbc
src/test/com/cloudera/sqoop/TestParquetImport.java 07e140a
src/test/com/cloudera/sqoop/hive/TestHiveImport.java fa717cb
src/test/com/cloudera/sqoop/testutil/BaseSqoopTestCase.java 7934791
src/test/com/cloudera/sqoop/testutil/ImportJobTestCase.java 293bf10
testdata/hive/scripts/normalImportAsParquet.q e434e9b
Diff: https://reviews.apache.org/r/33104/diff/
Testing
-------
Manually tested the append, new-create, and overwrite cases.
Thanks,
Qian Xu