[jira] [Commented] (SQOOP-3173) support DB2 xml data type when sqoop import with parquet

2017-04-17 Thread Ying Cao (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3173?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15972076#comment-15972076
 ] 

Ying Cao commented on SQOOP-3173:
-

[~maugli] could you kindly review the SQOOP-3173.patch I uploaded?

> support DB2 xml data type when sqoop import with parquet
> 
>
> Key: SQOOP-3173
> URL: https://issues.apache.org/jira/browse/SQOOP-3173
> Project: Sqoop
>  Issue Type: Bug
>  Components: connectors
>Affects Versions: 1.4.6
> Environment: RedHat6.4+Sqoop 1.4.6 +Hadoop 2.7.3
>Reporter: Ying Cao
>Assignee: Ying Cao
>Priority: Minor
>  Labels: patch
> Fix For: 1.4.7
>
> Attachments: SQOOP-3173.patch
>
>
> When Sqoop runs an import job with Parquet from DB2, the XML data type fails 
> with:
> 17/01/16 00:48:39 ERROR tool.ImportTool: Imported Failed: Cannot convert SQL 
> type 
> Sqoop uses the Kite SDK to resolve the Parquet dataset, which requires an 
> Avro schema to represent the data structure; during Avro schema creation, 
> Sqoop cannot find a corresponding type mapping for Types.OTHER.
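The failure described above can be sketched as a JDBC-to-Avro type lookup that has no case for Types.OTHER (the type DB2 reports for XML columns). The class and method names below are illustrative, not Sqoop's actual internals; this is a minimal, self-contained sketch of the kind of mapping a patch like this presumably adds, treating the XML value as a string:

```java
import java.sql.Types;

// Illustrative sketch only: maps JDBC SQL type codes to Avro primitive type
// names, mirroring the kind of lookup that fails for Types.OTHER (DB2 XML).
public class SqlToAvro {

    static String toAvroTypeName(int sqlType) {
        switch (sqlType) {
            case Types.SMALLINT:
            case Types.INTEGER:
                return "int";
            case Types.BIGINT:
                return "long";
            case Types.CHAR:
            case Types.VARCHAR:
                return "string";
            case Types.OTHER:
                // DB2 XML surfaces as Types.OTHER; mapping it to "string"
                // (serialized XML text) avoids the import failure.
                return "string";
            default:
                // Without a mapping, generation fails, as in the reported
                // "Cannot convert SQL type" error.
                throw new IllegalArgumentException(
                        "Cannot convert SQL type " + sqlType);
        }
    }
}
```

Mapping the XML payload to an Avro string is the least invasive choice here, since Parquet has no native XML logical type and the serialized document round-trips losslessly as text.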



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (SQOOP-3173) support DB2 xml data type when sqoop import with parquet

2017-04-17 Thread Ying Cao (JIRA)

 [ 
https://issues.apache.org/jira/browse/SQOOP-3173?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ying Cao updated SQOOP-3173:

Attachment: SQOOP-3173.patch

> support DB2 xml data type when sqoop import with parquet
> 
>
> Key: SQOOP-3173
> URL: https://issues.apache.org/jira/browse/SQOOP-3173
> Project: Sqoop
>  Issue Type: Bug
>  Components: connectors
>Affects Versions: 1.4.6
> Environment: RedHat6.4+Sqoop 1.4.6 +Hadoop 2.7.3
>Reporter: Ying Cao
>Assignee: Ying Cao
>Priority: Minor
>  Labels: patch
> Fix For: 1.4.7
>
> Attachments: SQOOP-3173.patch
>
>
> When Sqoop runs an import job with Parquet from DB2, the XML data type fails 
> with:
> 17/01/16 00:48:39 ERROR tool.ImportTool: Imported Failed: Cannot convert SQL 
> type 
> Sqoop uses the Kite SDK to resolve the Parquet dataset, which requires an 
> Avro schema to represent the data structure; during Avro schema creation, 
> Sqoop cannot find a corresponding type mapping for Types.OTHER.





[jira] [Commented] (SQOOP-3158) Columns added to Mysql after initial sqoop import, export back to table with same schema fails

2017-04-17 Thread viru reddy (JIRA)

[ 
https://issues.apache.org/jira/browse/SQOOP-3158?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15971769#comment-15971769
 ] 

viru reddy commented on SQOOP-3158:
---

The solution looks good.
It looks like this needs to be implemented in a new version of Sqoop.

> Columns added to Mysql after initial sqoop import, export back to table with 
> same schema fails 
> ---
>
> Key: SQOOP-3158
> URL: https://issues.apache.org/jira/browse/SQOOP-3158
> Project: Sqoop
>  Issue Type: Improvement
>Reporter: viru reddy
>Assignee: Eric Lin
>  Labels: newbie
> Attachments: SQOOP-3158.patch
>
>
> I have a table in MySQL that had 2 columns until yesterday. The columns are 
> id and name.
> 1,Raj
> 2,Jack
> I imported this data into HDFS yesterday as a file. Today we added a new 
> column to the table in MySQL called salary. The table now looks like 
> below.
> 1,Raj
> 2,Jack
> 3,Jill,2000
> 4,Nick,3000
> Then I ran an incremental import on this table to a file.
> Part-m-0 file contains
> 1,Raj
> 2,Jack
> Part-m-1 file contains
> 3,Jill,2000
> 4,Nick,3000
> Then I created a new table in MySQL with the same schema as the original 
> MySQL table, with columns id, name and salary.
> When I do sqoop export, only the last 2 rows get inserted into the new table 
> in MySQL and the sqoop export fails.
> How can I get all the rows inserted into the table?
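The export fails because the files from the first import have 2 fields per line while the new target table expects 3. One possible workaround, independent of the attached patch and purely a sketch (the class name and default value are illustrative), is to pad the short rows with a default salary before exporting, so every line matches the 3-column schema:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical pre-export step: pad rows written before the schema change
// with a default value so every line has the same number of columns.
public class PadRows {

    static List<String> pad(List<String> lines, int width, String defaultVal) {
        List<String> out = new ArrayList<>();
        for (String line : lines) {
            // split with limit -1 keeps trailing empty fields
            List<String> cols =
                    new ArrayList<>(Arrays.asList(line.split(",", -1)));
            while (cols.size() < width) {
                cols.add(defaultVal);   // fill the missing salary column
            }
            out.add(String.join(",", cols));
        }
        return out;
    }
}
```

For example, `pad(Arrays.asList("1,Raj", "3,Jill,2000"), 3, "0")` yields `["1,Raj,0", "3,Jill,2000"]`; after normalizing the files this way, all rows match the target table and the export can insert every row.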


