[ https://issues.apache.org/jira/browse/FLINK-7050?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16182525#comment-16182525 ]

ASF GitHub Bot commented on FLINK-7050:
---------------------------------------

Github user uybhatti commented on a diff in the pull request:

    https://github.com/apache/flink/pull/4660#discussion_r141336157
  
    --- Diff: flink-libraries/flink-table/pom.xml ---
    @@ -114,6 +114,12 @@ under the License.
                        <groupId>joda-time</groupId>
                        <artifactId>joda-time</artifactId>
                </dependency>
    +           <!-- FOR RFC 4180 Compliant CSV Parser -->
    +           <dependency>
    --- End diff --
    
    I think it's better to move this functionality into `flink-connectors` as
    `flink-connector-csv`. What do you think?


> RFC Compliant CSV Parser for Table Source
> -----------------------------------------
>
>                 Key: FLINK-7050
>                 URL: https://issues.apache.org/jira/browse/FLINK-7050
>             Project: Flink
>          Issue Type: Improvement
>          Components: Table API & SQL
>    Affects Versions: 1.3.1
>            Reporter: Usman Younas
>            Assignee: Usman Younas
>              Labels: csv, parsing
>             Fix For: 1.4.0
>
>
> Currently, the Flink CSV parser is not compliant with RFC 4180. Because of
> this, it cannot parse standard CSV files that contain double quotes or
> delimiters within fields.
> To reproduce this bug, take a CSV file whose records contain double quotes
> inside a field and parse it with the Flink CSV parser. One such issue is
> described in [FLINK-4785|https://issues.apache.org/jira/browse/FLINK-4785].
> These CSV-related issues will be solved by making the CSV parser compliant
> with the RFC 4180 standard for the Table Source.
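
For reference, the RFC 4180 quoting rules the description refers to (escaped quotes written as `""` inside a quoted field, and delimiters treated as literal characters between quotes) can be sketched as below. This is a minimal, hypothetical illustration of the standard's rules, not Flink's actual parser; the class name `Rfc4180LineParser` is invented here.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of RFC 4180 field splitting for a single record line.
// Assumes comma as delimiter and no embedded line breaks, for brevity.
public class Rfc4180LineParser {

    public static List<String> parseLine(String line) {
        List<String> fields = new ArrayList<>();
        StringBuilder field = new StringBuilder();
        boolean inQuotes = false;
        for (int i = 0; i < line.length(); i++) {
            char c = line.charAt(i);
            if (inQuotes) {
                if (c == '"') {
                    // RFC 4180: a doubled quote inside a quoted field is an escaped quote.
                    if (i + 1 < line.length() && line.charAt(i + 1) == '"') {
                        field.append('"');
                        i++;
                    } else {
                        inQuotes = false; // closing quote of the field
                    }
                } else {
                    field.append(c); // delimiters are literal inside quotes
                }
            } else if (c == '"') {
                inQuotes = true; // opening quote of a quoted field
            } else if (c == ',') {
                fields.add(field.toString());
                field.setLength(0);
            } else {
                field.append(c);
            }
        }
        fields.add(field.toString());
        return fields;
    }

    public static void main(String[] args) {
        // A field containing both a delimiter and escaped quotes:
        System.out.println(parseLine("a,\"b,\"\"c\"\"\",d")); // prints [a, b,"c", d]
    }
}
```

A naive split on the delimiter would break the middle field apart, which is exactly the failure mode reported against the current parser.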



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
