-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/33104/
-----------------------------------------------------------

Review request for Sqoop.


Bugs: SQOOP-2295
    https://issues.apache.org/jira/browse/SQOOP-2295


Repository: sqoop-trunk


Description
-------

Currently, importing as Parquet into an existing dataset throws an exception. This 
differs from `--as-textfile`. I've checked the user manual; the handling of HDFS 
and Hive is indeed different. For HDFS, unless `--append` is specified, the job 
fails when the destination already exists. For Hive, unless `--create-hive-table` 
is specified, the job runs in append mode. This patch makes the handling of 
`--as-textfile` and `--as-parquetfile` consistent. (Note that `--as-avrodatafile` 
is not supported; it could be handled similarly to Parquet in a follow-up JIRA.)
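To illustrate the intended (now consistent) behavior, here is a sketch of two 
hypothetical invocations; the connection string, table name, and target 
directory are placeholders, not taken from the patch:

```shell
# Import as Parquet into an existing HDFS directory; without --append this
# should now fail when the destination exists, matching --as-textfile.
sqoop import \
  --connect jdbc:mysql://db.example.com/corp \
  --table EMPLOYEES \
  --as-parquetfile \
  --target-dir /user/foo/employees

# With --append, new records are added to the existing dataset instead.
sqoop import \
  --connect jdbc:mysql://db.example.com/corp \
  --table EMPLOYEES \
  --as-parquetfile \
  --target-dir /user/foo/employees \
  --append
```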


Diffs
-----

  src/docs/man/hive-args.txt 7d9e427 
  src/docs/man/sqoop-create-hive-table.txt 7aebcc1 
  src/docs/user/create-hive-table.txt 3aa34fd 
  src/docs/user/hive-args.txt 53de92d 
  src/java/org/apache/sqoop/mapreduce/DataDrivenImportJob.java e70d23c 
  src/java/org/apache/sqoop/mapreduce/ParquetJob.java df55dbc 

Diff: https://reviews.apache.org/r/33104/diff/


Testing
-------

Manually tested the append, new-create, and overwrite cases.


Thanks,

Qian Xu
