Re: Review Request 57551: SQOOP-2272 - Import decimal columns from mysql to hive 0.14

2017-07-02 Thread Eric Lin

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/57551/
---

(Updated July 3, 2017, 5:36 a.m.)


Review request for Sqoop, Attila Szabo and Szabolcs Vasas.


Changes
---

Uploading a new patch based on the latest trunk code for review. Please take a 
look so we can hopefully resolve the JIRA. Thanks a lot for your help.


Repository: sqoop-trunk


Description
---

Currently Sqoop converts DECIMAL from the RDBMS into DOUBLE in Hive, which is 
not correct as users will lose precision. Since Hive has supported DECIMAL for 
a long time, Sqoop should support DECIMAL-to-DECIMAL conversion into Hive.
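For context on why the old mapping is lossy: Hive's DECIMAL is exact, while DOUBLE is a 64-bit IEEE 754 binary float with only ~15-17 significant decimal digits. A minimal Python sketch (illustrative value, not from the patch) of the precision loss:

```python
from decimal import Decimal

# An exact value with more significant digits than a 64-bit DOUBLE
# can represent (doubles carry only ~15-17 significant decimal digits).
exact = Decimal("1234567890123456.789")

# Simulate the old DECIMAL -> DOUBLE mapping by round-tripping
# through a Python float (an IEEE 754 double).
as_double = Decimal(float(exact))

print(as_double == exact)  # False: the fractional digits are rounded away
```

With a DECIMAL-to-DECIMAL mapping, no such round-trip through a binary float occurs, so the stored value stays exact.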


Diffs (updated)
-

  src/java/org/apache/sqoop/hive/HiveTypes.java ad00535 
  src/java/org/apache/sqoop/hive/TableDefWriter.java deec32d 
  src/test/com/cloudera/sqoop/hive/TestHiveImport.java a624f52 
  src/test/com/cloudera/sqoop/hive/TestTableDefWriter.java dbf0dde 
  testdata/hive/scripts/decimalImport.q PRE-CREATION 


Diff: https://reviews.apache.org/r/57551/diff/6/

Changes: https://reviews.apache.org/r/57551/diff/5-6/


Testing
---

Test case + manual test


Thanks,

Eric Lin



[jira] [Resolved] (SQOOP-3150) issue with sqoop hive import with partitions

2017-07-02 Thread Eric Lin (JIRA)

 [ 
https://issues.apache.org/jira/browse/SQOOP-3150?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Lin resolved SQOOP-3150.
-
   Resolution: Not A Bug
Fix Version/s: no-release

This is not a bug, resolving it.

> issue with sqoop hive import with partitions
> 
>
> Key: SQOOP-3150
> URL: https://issues.apache.org/jira/browse/SQOOP-3150
> Project: Sqoop
>  Issue Type: Bug
>  Components: hive-integration
>Affects Versions: 1.4.6
> Environment: Cent-Os
>Reporter: Ankit Kumar
>Assignee: Eric Lin
>  Labels: features
> Fix For: no-release
>
>
> Sqoop Command:
>   sqoop import \
>   ...
>   --hive-import  \
>   --hive-overwrite  \
>   --hive-table employees_p  \
>   --hive-partition-key date  \
>   --hive-partition-value 10-03-2017  \
>   --target-dir ..\
>   -m 1  
>   
>   hive-table script:
>   employees_p is a partitioned table on date(string) column
>   
>   Issue:- 
>   Case 1: When --target-dir is
> /user/hdfs/landing/staging/Hive/partitioned/EMPLOYEES,
>   running the above sqoop command fails with the error "directory already
> exists".
>   
>   Case 2: When --target-dir is
> /user/hdfs/landing/staging/Hive/partitioned/EMPLOYEES/anyname,
>   the above sqoop command creates a Hive partition (date=10-03-2017) and
>   the directory
>   '/user/hdfs/landing/staging/Hive/partitioned/EMPLOYEES/date=10-03-2017'
>   
> Expected Behaviour: Since --hive-partition-key and --hive-partition-value
> are present in the sqoop command, it should automatically create the
> partitioned directory inside EMPLOYEES,
> i.e. '/user/hdfs/landing/staging/Hive/partitioned/EMPLOYEES/date=10-03-2017'
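For reference, Hive names each partition directory with the key=value convention under the table root, which is where the path the reporter expected comes from. A small sketch of that convention (the helper name is my own, not part of Sqoop or Hive):

```python
# Hive stores each partition in a subdirectory named <key>=<value>
# under the table's root directory. The helper name is illustrative.
def partition_path(table_root: str, key: str, value: str) -> str:
    return f"{table_root}/{key}={value}"

print(partition_path(
    "/user/hdfs/landing/staging/Hive/partitioned/EMPLOYEES",
    "date",
    "10-03-2017",
))
# /user/hdfs/landing/staging/Hive/partitioned/EMPLOYEES/date=10-03-2017
```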



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Updated] (SQOOP-2272) Import decimal columns from mysql to hive 0.14

2017-07-02 Thread Eric Lin (JIRA)

 [ 
https://issues.apache.org/jira/browse/SQOOP-2272?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eric Lin updated SQOOP-2272:

Attachment: SQOOP-2272.5.patch

Attaching the latest patch, based on current trunk code.

> Import decimal columns from mysql to hive 0.14
> --
>
> Key: SQOOP-2272
> URL: https://issues.apache.org/jira/browse/SQOOP-2272
> Project: Sqoop
>  Issue Type: Bug
>  Components: sqoop2-shell
>Affects Versions: 1.4.5
>Reporter: Pawan Pawar
>Assignee: Eric Lin
> Attachments: SQOOP-2272.2.patch, SQOOP-2272.3.patch, 
> SQOOP-2272.4.patch, SQOOP-2272.5.patch, SQOOP-2272.patch
>
>
> I am importing data from MySQL to Hive. Several columns in the source table
> are of type DECIMAL, but Sqoop converts these types into DOUBLE. How can I
> import that table with the same precision and scale in Hive?
> My query is:
> sqoop import --connect 
> jdbc:mysql://localhost:3306/SourceDataBase?zeroDateTimeBehavior=convertToNull 
> --username root --password root --hive-table MyHiveDatabaseName.MyTableName 
> --hive-import  --hive-table MyHiveDatabaseName.MyTableName --query 'select * 
> from MyTableName where $CONDITIONS' -m 1 --target-dir 
> /user/hive/warehouse/MyHiveDatabaseName/MyTableName 
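The fix under review changes Sqoop's JDBC-to-Hive type mapping rather than the command line. As a rough illustration only (not the actual patch, which modifies org.apache.sqoop.hive.HiveTypes in Java), the intended mapping change can be sketched as:

```python
# Illustrative sketch of the intended JDBC -> Hive type mapping;
# the real implementation lives in org.apache.sqoop.hive.HiveTypes.
def to_hive_type(jdbc_type: str, precision: int = None, scale: int = None) -> str:
    if jdbc_type == "DECIMAL" and precision is not None:
        # New behaviour: preserve precision and scale in Hive.
        return f"DECIMAL({precision},{scale or 0})"
    # Old behaviour collapsed DECIMAL to DOUBLE, losing precision.
    legacy = {"DECIMAL": "DOUBLE", "VARCHAR": "STRING", "INTEGER": "INT"}
    return legacy.get(jdbc_type, "STRING")

print(to_hive_type("DECIMAL", 10, 2))  # DECIMAL(10,2)
print(to_hive_type("DECIMAL"))         # DOUBLE (fallback when p/s unknown)
```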


