[ 
https://issues.apache.org/jira/browse/SQOOP-2038?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Veena Basavaraj updated SQOOP-2038:
-----------------------------------
    Description: 
We cannot assume that the HDFS Connector's data sets on HDFS are already in the 
Sqoop CSV string format.

So don't we need to encode the data into the correct format on the FROM side?
{code}
 } else {
        dataWriter.writeStringRecord(line.toString());
      }
{code}
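To illustrate the concern: the raw line read from HDFS would need Sqoop CSV quoting/escaping before {{writeStringRecord}} can assume it. This is only a sketch, not the connector's actual code; the helper name {{encodeTextField}} is hypothetical, and in real Sqoop2 the encoding rules live in CSVIntermediateDataFormat.

```java
// Hypothetical sketch of the kind of encoding the FROM side would need
// before calling dataWriter.writeStringRecord(...). Sqoop CSV wraps text
// fields in single quotes and backslash-escapes special characters; the
// real rules are defined by CSVIntermediateDataFormat, not by this helper.
public class SqoopCsvSketch {
  // Quote one raw text field: wrap in single quotes, escape backslash,
  // quote, and line-break characters.
  static String encodeTextField(String raw) {
    StringBuilder sb = new StringBuilder("'");
    for (char c : raw.toCharArray()) {
      switch (c) {
        case '\\': sb.append("\\\\"); break;
        case '\'': sb.append("\\'");  break;
        case '\n': sb.append("\\n");  break;
        case '\r': sb.append("\\r");  break;
        default:   sb.append(c);
      }
    }
    return sb.append('\'').toString();
  }
}
```

With such a step in place, the else branch would pass {{encodeTextField(line.toString())}} (or rather the real IDF's encoder) instead of the raw line.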

  was:
At this point, the HDFS Connector does not send BigDecimal for values of the 
Sqoop Decimal type. Just as we prescribe Joda-Time objects for dates, we need 
BigDecimal for decimal values.

writeString is fine, but writeArray needs to follow the Sqoop object format.
Code to be fixed:
{code}

    rowsRead++;
    if (HdfsUtils.hasCustomFormat(linkConfiguration, fromJobConfiguration)) {
      dataWriter.writeArrayRecord(HdfsUtils.formatRecord(linkConfiguration,
          fromJobConfiguration, line.toString()));
    } else {
      dataWriter.writeStringRecord(line.toString());
    }

{code}
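The old description above asks that values handed to {{writeArrayRecord}} already be the Java types Sqoop prescribes per column type. A minimal sketch of that conversion, assuming a hypothetical helper {{toSqoopObject}} (the real mapping is done by Sqoop's intermediate data format; Joda-Time handling for date columns is omitted here):

```java
import java.math.BigDecimal;

// Hypothetical sketch: convert a raw string value into the Java object
// type Sqoop prescribes for the column, e.g. BigDecimal for a Sqoop
// Decimal column (dates would use Joda-Time objects, omitted here).
// The resulting objects are what writeArrayRecord(...) should receive.
public class SqoopObjectFormatSketch {
  static Object toSqoopObject(String columnType, String rawValue) {
    switch (columnType) {
      case "DECIMAL":
        return new BigDecimal(rawValue); // not a Double or raw String
      case "TEXT":
        return rawValue;
      default:
        throw new IllegalArgumentException("unhandled type: " + columnType);
    }
  }
}
```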



> Sqoop2: HDFSConnector should ensure writeString(text) is in sqoop CSV format
> ----------------------------------------------------------------------------
>
>                 Key: SQOOP-2038
>                 URL: https://issues.apache.org/jira/browse/SQOOP-2038
>             Project: Sqoop
>          Issue Type: Bug
>            Reporter: Veena Basavaraj
>            Assignee: Qian Xu
>             Fix For: 1.99.6
>
>
> We cannot assume that the HDFS Connector's data sets on HDFS are already in 
> the Sqoop CSV string format.
> So don't we need to encode the data into the correct format on the FROM side?
> {code}
>  } else {
>         dataWriter.writeStringRecord(line.toString());
>       }
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)