[jira] [Commented] (FLINK-4108) NPE in Row.productArity

2016-10-15 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4108?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15577612#comment-15577612
 ] 

ASF GitHub Bot commented on FLINK-4108:
---

Github user asfgit closed the pull request at:

https://github.com/apache/flink/pull/2619


> NPE in Row.productArity
> ---
>
> Key: FLINK-4108
> URL: https://issues.apache.org/jira/browse/FLINK-4108
> Project: Flink
>  Issue Type: Bug
>  Components: Batch Connectors and Input/Output Formats, Type 
> Serialization System
>Affects Versions: 1.1.0
>Reporter: Martin Scholl
>Assignee: Timo Walther
>
> [this is my first issue request here, please excuse me if something is
> missing]
> JDBCInputFormat of Flink 1.1-SNAPSHOT fails with an NPE in Row.productArity:
> {quote}
> java.io.IOException: Couldn't access resultSet
> at 
> org.apache.flink.api.java.io.jdbc.JDBCInputFormat.nextRecord(JDBCInputFormat.java:288)
> at 
> org.apache.flink.api.java.io.jdbc.JDBCInputFormat.nextRecord(JDBCInputFormat.java:98)
> at 
> org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:162)
> at org.apache.flink.runtime.taskmanager.Task.run(Task.java:588)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.NullPointerException
> at org.apache.flink.api.table.Row.productArity(Row.scala:28)
> at 
> org.apache.flink.api.java.io.jdbc.JDBCInputFormat.nextRecord(JDBCInputFormat.java:279)
> ... 4 more
> {quote}
> The code to reproduce this can be found in this gist:
> https://gist.github.com/zeitgeist/b91a60460661618ca4585e082895c616
> The reason for the NPE, I believe, is the way Flink creates Row instances
> through Kryo. As Row expects the number of fields to allocate as a constructor
> parameter, which Kryo does not provide, the {{fields}} member of Row ends up
> being null. As I'm neither a reflection nor a Kryo expert, I'd rather leave a
> true analysis to more knowledgeable programmers.
> The aforementioned example also contains a not very elegant workaround through
> a custom type and a cast (function {{jdbcNoIssue}} + custom Row type {{MyRow}}),
> which serves as a further hint towards my theory.
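>
> To make this concrete, here is a minimal sketch of the suspected failure mode.
> The Row class below is a simplified stand-in for the real
> org.apache.flink.api.table.Row, and the Objenesis call only simulates Kryo's
> constructor-bypassing instantiation; it is not the exact code path Flink takes:
> {code:scala}
> // Simplified stand-in for org.apache.flink.api.table.Row (Flink 1.1).
> class Row(arity: Int) extends Product {
>   private val fields = new Array[Any](arity)
>
>   def setField(i: Int, value: Any): Unit = fields(i) = value
>
>   // NullPointerException if the primary constructor never ran,
>   // because 'fields' is still null in that case.
>   override def productArity: Int = fields.length
>   override def productElement(i: Int): Any = fields(i)
>   override def canEqual(that: Any): Boolean = that.isInstanceOf[Row]
> }
>
> import org.objenesis.ObjenesisStd
>
> object NpeDemo extends App {
>   val regular = new Row(2)
>   println(regular.productArity)   // 2, as expected
>
>   // Objenesis creates the instance without running the primary constructor,
>   // similar to what Kryo falls back to when it cannot supply the arity.
>   val bypassed = new ObjenesisStd().newInstance(classOf[Row])
>   println(bypassed.productArity)  // throws java.lang.NullPointerException
> }
> {code}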





[jira] [Commented] (FLINK-4108) NPE in Row.productArity

2016-10-14 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4108?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15576491#comment-15576491
 ] 

ASF GitHub Bot commented on FLINK-4108:
---

Github user fhueske commented on a diff in the pull request:

https://github.com/apache/flink/pull/2619#discussion_r83502181
  
--- Diff: 
flink-scala/src/main/scala/org/apache/flink/api/scala/ExecutionEnvironment.scala
 ---
@@ -110,7 +110,7 @@ class ExecutionEnvironment(javaEnv: JavaEnv) {
 */
   @PublicEvolving
   def getRestartStrategy: RestartStrategyConfiguration = {
-    javaEnv.getRestartStrategy()
+    javaEnv.getRestartStrategy
--- End diff --

Apparently this is not the case in this class. Similar methods are also
called without parentheses, so let's keep this change to have consistent code in
this class.
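
A small, self-contained illustration of the style question (the Counter class
below is a made-up stand-in for a Java-defined class such as
ExecutionEnvironment; it is not Flink code):
{code:scala}
// Hypothetical example of a Java-style getter called from Scala 2.
class Counter {            // imagine this class were written in Java
  def getValue(): Int = 42
}

object ParenStyle extends App {
  val c = new Counter
  println(c.getValue())    // explicit parentheses: the usual way to call Java methods
  println(c.getValue)      // parentheses omitted: reads like a side-effect-free
                           // accessor, the style already used elsewhere in the class
}
{code}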







[jira] [Commented] (FLINK-4108) NPE in Row.productArity

2016-10-14 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4108?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15576439#comment-15576439
 ] 

ASF GitHub Bot commented on FLINK-4108:
---

Github user fhueske commented on the issue:

https://github.com/apache/flink/pull/2619
  
Reverting the change and merging







[jira] [Commented] (FLINK-4108) NPE in Row.productArity

2016-10-14 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4108?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15576308#comment-15576308
 ] 

ASF GitHub Bot commented on FLINK-4108:
---

Github user fhueske commented on a diff in the pull request:

https://github.com/apache/flink/pull/2619#discussion_r83489370
  
--- Diff: 
flink-scala/src/main/scala/org/apache/flink/api/scala/ExecutionEnvironment.scala
 ---
@@ -110,7 +110,7 @@ class ExecutionEnvironment(javaEnv: JavaEnv) {
 */
   @PublicEvolving
   def getRestartStrategy: RestartStrategyConfiguration = {
-    javaEnv.getRestartStrategy()
+    javaEnv.getRestartStrategy
--- End diff --

We call methods on Java objects with parentheses. 







[jira] [Commented] (FLINK-4108) NPE in Row.productArity

2016-10-12 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4108?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15568712#comment-15568712
 ] 

ASF GitHub Bot commented on FLINK-4108:
---

Github user albertoRamon commented on the issue:

https://github.com/apache/flink/pull/2619
  
Works OK  ¡¡
[Capture 
Output](http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/jdbc-JDBCInputFormat-tp9393p9489.html)

Thanks, Alb
 







[jira] [Commented] (FLINK-4108) NPE in Row.productArity

2016-10-11 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4108?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15565000#comment-15565000
 ] 

ASF GitHub Bot commented on FLINK-4108:
---

GitHub user twalthr opened a pull request:

https://github.com/apache/flink/pull/2619

[FLINK-4108] [scala] Consider ResultTypeQueryable for input formats

Thanks for contributing to Apache Flink. Before you open your pull request, 
please take the following checklist into consideration.
If your changes take all of the items into account, feel free to open your 
pull request. For more information and/or questions please refer to the [How To 
Contribute guide](http://flink.apache.org/how-to-contribute.html).
In addition to going through the list, please provide a meaningful 
description of your changes.

- [x] General
  - The pull request references the related JIRA issue ("[FLINK-XXX] Jira 
title text")
  - The pull request addresses only one issue
  - Each commit in the PR has a meaningful commit message (including the 
JIRA id)

- [x] Documentation
  - Documentation has been added for new functionality
  - Old documentation affected by the pull request has been updated
  - JavaDoc for public methods has been added

- [x] Tests & Build
  - Functionality added by the pull request is covered by tests
  - `mvn clean verify` has been executed successfully locally or a Travis 
build has passed

This PR fixes issues with input formats (e.g. the JDBC input format) in the
Scala API. Now the API also considers explicit type information defined via
ResultTypeQueryable.
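
As a rough sketch of the pattern this enables (the format below is a made-up
stand-in, not the actual JDBCInputFormat; only ResultTypeQueryable and
getProducedType are the real Flink API):

{code:scala}
import org.apache.flink.api.common.io.GenericInputFormat
import org.apache.flink.api.common.typeinfo.{BasicTypeInfo, TypeInformation}
import org.apache.flink.api.java.typeutils.ResultTypeQueryable

// Hypothetical input format that announces its result type explicitly.
// With this change, the Scala API should pick up getProducedType instead of
// falling back to generic (Kryo-based) type extraction.
class SingleStringFormat extends GenericInputFormat[String]
    with ResultTypeQueryable[String] {

  private var emitted = false

  override def reachedEnd(): Boolean = emitted

  override def nextRecord(reuse: String): String = {
    emitted = true
    "hello"
  }

  override def getProducedType: TypeInformation[String] =
    BasicTypeInfo.STRING_TYPE_INFO
}

// Usage (requires `import org.apache.flink.api.scala._` for the implicits):
//   val env  = ExecutionEnvironment.getExecutionEnvironment
//   val data = env.createInput(new SingleStringFormat)  // DataSet[String]
{code}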


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/twalthr/flink FLINK-4108

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/2619.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2619


commit 24b0cac3b58c1ea52b3abc7af3484f6701ab5047
Author: twalthr 
Date:   2016-10-11T09:19:32Z

[FLINK-4108] [scala] Consider ResultTypeQueryable for input formats









[jira] [Commented] (FLINK-4108) NPE in Row.productArity

2016-10-10 Thread Timo Walther (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-4108?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15561763#comment-15561763
 ] 

Timo Walther commented on FLINK-4108:
-

Actually, RowTypeInfo was intended for the Table API only. If it is used
outside of the Table API, there should be additional tests.




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)