[ 
https://issues.apache.org/jira/browse/SQOOP-1937?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14257181#comment-14257181
 ] 

Veena Basavaraj commented on SQOOP-1937:
----------------------------------------

Also some additional points: the out.writeUTF call we use in the custom 
implementation expects a String, so this means the JSON/Avro IDF has to 
convert its payload to a String anyway. So I am failing to understand why we 
could not just use the standard CSV string (Text) as the writable; why do we 
need a custom writable?

{code}
out.writeUTF(..)
{code}
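To illustrate the point: DataOutput.writeUTF serializes a plain Java String (a 2-byte length prefix followed by modified UTF-8), so whatever the IDF holds must already be a String by the time the custom Writable serializes it. A minimal sketch of that round trip, using only java.io (the class name and the CSV record below are hypothetical, not from the Sqoop code):

{code}
import java.io.*;

// Sketch: writeUTF only ever carries a String, which is why a Text-based
// writable would carry the same information as the custom one.
public class WriteUtfSketch {
    public static String roundTrip(String csv) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        // Same call the custom Writable implementation makes.
        new DataOutputStream(buf).writeUTF(csv);
        // Read it back; the String survives unchanged.
        return new DataInputStream(
                new ByteArrayInputStream(buf.toByteArray())).readUTF();
    }

    public static void main(String[] args) throws IOException {
        String csv = "1,'hello',3.14"; // hypothetical CSV record
        System.out.println(roundTrip(csv).equals(csv)); // prints "true"
    }
}
{code}

Since Text is itself just a serialized string with the Comparable/Writable plumbing already in place, the custom wrapper adds no information to what is written.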

> Why do we need SqoopWritable? Why not just Text?
> ---------------------------------------------
>
>                 Key: SQOOP-1937
>                 URL: https://issues.apache.org/jira/browse/SQOOP-1937
>             Project: Sqoop
>          Issue Type: Sub-task
>            Reporter: Veena Basavaraj
>
> The SqoopWritable is just a Text underneath.
> The Text class in Hadoop is similar to a Java String, and Text already 
> implements the interfaces Comparable, Writable, and WritableComparable; 
> could this not be enough?
> {code}
> public class SqoopMapper extends Mapper<SqoopSplit, NullWritable, Text, 
> NullWritable> {
> {code}
> {code}
>         context.write(new Text(toIDF.getCSVTextData()), NullWritable.get());
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)