ASF GitHub Bot commented on FLINK-4108:

GitHub user twalthr opened a pull request:


    [FLINK-4108] [scala] Consider ResultTypeQueryable for input formats

    Thanks for contributing to Apache Flink. Before you open your pull request, 
please take the following checklist into consideration.
    If your changes take all of the items into account, feel free to open your 
pull request. For more information and/or questions please refer to the [How To 
Contribute guide](http://flink.apache.org/how-to-contribute.html).
    In addition to going through the list, please provide a meaningful 
description of your changes.
    - [x] General
      - The pull request references the related JIRA issue ("[FLINK-XXX] Jira 
title text")
      - The pull request addresses only one issue
      - Each commit in the PR has a meaningful commit message (including the 
JIRA id)
    - [x] Documentation
      - Documentation has been added for new functionality
      - Old documentation affected by the pull request has been updated
      - JavaDoc for public methods has been added
    - [x] Tests & Build
      - Functionality added by the pull request is covered by tests
      - `mvn clean verify` has been executed successfully locally or a Travis 
build has passed
    This PR fixes issues with input formats (e.g. the JDBC input format) in the 
Scala API. Now the API also considers explicit type information defined via the 
`ResultTypeQueryable` interface.

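The idea behind the fix can be sketched in a few lines of self-contained Scala. The stand-in traits below are NOT Flink's real classes (only the names `ResultTypeQueryable` and `getProducedType` match the actual API); they only illustrate checking for explicitly declared type information before falling back to generic extraction:

```scala
// Hypothetical, simplified stand-ins for Flink's interfaces -- not the
// real Flink classes; only ResultTypeQueryable / getProducedType are
// actual API names.
trait TypeInformation[T]
case class RowTypeInfo(fieldTypes: Seq[String]) extends TypeInformation[Any]

trait ResultTypeQueryable[T] {
  def getProducedType: TypeInformation[T]
}

// An input format that announces its produced type explicitly instead of
// relying on reflective extraction (which led to the Kryo fallback):
class JdbcLikeInputFormat extends ResultTypeQueryable[Any] {
  override def getProducedType: TypeInformation[Any] =
    RowTypeInfo(Seq("INT", "VARCHAR"))
}

object TypeExtraction {
  // What the fix does conceptually: prefer ResultTypeQueryable when the
  // format implements it, otherwise use the generic fallback.
  def typeInfoFor(format: Any,
                  fallback: => TypeInformation[Any]): TypeInformation[Any] =
    format match {
      case q: ResultTypeQueryable[Any @unchecked] => q.getProducedType
      case _                                      => fallback
    }
}
```

With explicit type information available, the Scala API can build a proper Row serializer instead of handing the type to Kryo.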
You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/twalthr/flink FLINK-4108

Alternatively you can review and apply these changes as the patch at:


To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #2619
commit 24b0cac3b58c1ea52b3abc7af3484f6701ab5047
Author: twalthr <twal...@apache.org>
Date:   2016-10-11T09:19:32Z

    [FLINK-4108] [scala] Consider ResultTypeQueryable for input formats


> NPE in Row.productArity
> -----------------------
>                 Key: FLINK-4108
>                 URL: https://issues.apache.org/jira/browse/FLINK-4108
>             Project: Flink
>          Issue Type: Bug
>          Components: Batch Connectors and Input/Output Formats, Type 
> Serialization System
>    Affects Versions: 1.1.0
>            Reporter: Martin Scholl
>            Assignee: Timo Walther
> [this is my first issue report here, please forgive me if something is 
> missing]
>  JDBCInputFormat of Flink 1.1-SNAPSHOT fails with an NPE in Row.productArity:
> {quote}
> java.io.IOException: Couldn't access resultSet
>         at 
> org.apache.flink.api.java.io.jdbc.JDBCInputFormat.nextRecord(JDBCInputFormat.java:288)
>         at 
> org.apache.flink.api.java.io.jdbc.JDBCInputFormat.nextRecord(JDBCInputFormat.java:98)
>         at 
> org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:162)
>         at org.apache.flink.runtime.taskmanager.Task.run(Task.java:588)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.NullPointerException
>         at org.apache.flink.api.table.Row.productArity(Row.scala:28)
>         at 
> org.apache.flink.api.java.io.jdbc.JDBCInputFormat.nextRecord(JDBCInputFormat.java:279)
>         ... 4 more
> {quote}
> The code to reproduce this can be found in this gist: 
> https://gist.github.com/zeitgeist/b91a60460661618ca4585e082895c616
> The reason for the NPE, I believe, is the way Flink creates Row instances 
> through Kryo. Row expects the number of fields to allocate as a constructor 
> parameter, which Kryo does not provide, so the 'fields' member of Row ends up 
> being null. As I'm neither a reflection nor a Kryo expert, I'd rather leave a 
> thorough analysis to more knowledgeable programmers.
> Part of the aforementioned example is a not-very-elegant workaround through a 
> custom type and a cast (function {{jdbcNoIssue}} plus custom Row type 
> {{MyRow}}), which serves as a further hint towards my theory.

This message was sent by Atlassian JIRA
