[jira] [Commented] (FLINK-8256) Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException

2018-06-09 Thread Alexandr Arkhipov (JIRA)


[ https://issues.apache.org/jira/browse/FLINK-8256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16506959#comment-16506959 ]

Alexandr Arkhipov commented on FLINK-8256:
--

Folks, I think you can close this one and reduce the number of open issues in the Flink JIRA. :)

> Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException
> --
>
> Key: FLINK-8256
> URL: https://issues.apache.org/jira/browse/FLINK-8256
> Project: Flink
>  Issue Type: Bug
>  Components: DataStream API
>Affects Versions: 1.4.0
> Environment: macOS, Local Flink v1.4.0, Scala 2.11
>Reporter: Ryan Brideau
>Priority: Major
>
> I built the newest release locally today, but when I try to filter a stream 
> using an anonymous or named function, I get an error. Here's a simple example:
> {code:java}
> import org.apache.flink.api.java.utils.ParameterTool
> import org.apache.flink.streaming.api.scala._
> object TestFunction {
>   def main(args: Array[String]): Unit = {
>     val env = StreamExecutionEnvironment.getExecutionEnvironment
>     val params = ParameterTool.fromArgs(args)
>     env.getConfig.setGlobalJobParameters(params)
>     val someArray = Array(1,2,3)
>     val stream = env.fromCollection(someArray).filter(_ => true)
>     stream.print().setParallelism(1)
>     env.execute("Testing Function")
>   }
> }
> {code}
> This results in:
> {code:java}
> Job execution switched to status FAILING.
> org.apache.flink.streaming.runtime.tasks.StreamTaskException: Cannot 
> instantiate user function.
> at 
> org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperator(StreamConfig.java:235)
> at 
> org.apache.flink.streaming.runtime.tasks.OperatorChain.createChainedOperator(OperatorChain.java:355)
> at 
> org.apache.flink.streaming.runtime.tasks.OperatorChain.createOutputCollector(OperatorChain.java:282)
> at 
> org.apache.flink.streaming.runtime.tasks.OperatorChain.<init>(OperatorChain.java:126)
> at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:231)
> at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
> at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.ClassCastException: cannot assign instance of 
> org.peopleinmotion.TestFunction$$anonfun$1 to field 
> org.apache.flink.streaming.api.scala.DataStream$$anon$7.cleanFun$6 of type 
> scala.Function1 in instance of 
> org.apache.flink.streaming.api.scala.DataStream$$anon$7
> at 
> java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2233)
> at 
> java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1405)
> at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2288)
> at 
> java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2206)
> at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2064)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568)
> at 
> java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2282)
> at 
> java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2206)
> at 
> java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2064)
> at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568)
> at java.io.ObjectInputStream.readObject(ObjectInputStream.java:428)
> at 
> org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:290)
> at 
> org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:248)
> at 
> org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperator(StreamConfig.java:220)
> ... 6 more
> 12/13/2017 15:10:01 Job execution switched to status FAILED.
> 
>  The program finished with the following exception:
> org.apache.flink.client.program.ProgramInvocationException: The program 
> execution failed: Job execution failed.
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:492)
> at 
> org.apache.flink.client.program.StandaloneClusterClient.submitJob(StandaloneClusterClient.java:105)
> at 
> org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:456)
> at 
> org.apache.flink.streaming.api.environment.StreamContextEnvironment.execute(StreamContextEnvironment.java:66)
> at 
> org.apache.flink.streaming.api.scala.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.scala:638)
> at org.peopleinmotion.TestFunction$.main(TestFunction.scala:20)
> at org.peopleinmotion.TestFunction.main(TestFunction.scala)
> at 

[jira] [Commented] (FLINK-8256) Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException

2017-12-14 Thread Ryan Brideau (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-8256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16290978#comment-16290978 ]

Ryan Brideau commented on FLINK-8256:
-

That works for me! Thanks again.


[jira] [Commented] (FLINK-8256) Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException

2017-12-14 Thread Stephan Ewen (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-8256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16290970#comment-16290970 ]

Stephan Ewen commented on FLINK-8256:
-

Great to hear!

Do you think we should close the issue with a reference to FLINK-8264, which should make sure this works even when the project setup uses an older quickstart template?


[jira] [Commented] (FLINK-8256) Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException

2017-12-14 Thread Ryan Brideau (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-8256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16290846#comment-16290846 ]

Ryan Brideau commented on FLINK-8256:
-

Thanks for looking into this so quickly. I managed to track down the root of 
the issue on my end. I had built my project previously using the snapshot 
archetype, and not the newest 1.4.0 one:

{code:java}
mvn archetype:generate   \
  -DarchetypeGroupId=org.apache.flink  \
  -DarchetypeArtifactId=flink-quickstart-scala \
  -DarchetypeVersion=1.4-SNAPSHOT
{code}

To fix the problem I built a new empty project using the 1.4.0 archetype, diffed its pom.xml against my existing one, and updated mine to match the new one; now everything works perfectly. I suspect that anybody who created a project recently might find themselves in the same situation.
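
For reference, the corresponding command with the release archetype should just be the snapshot command above with the version swapped:

{code:bash}
mvn archetype:generate   \
  -DarchetypeGroupId=org.apache.flink  \
  -DarchetypeArtifactId=flink-quickstart-scala \
  -DarchetypeVersion=1.4.0
{code}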


[jira] [Commented] (FLINK-8256) Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException

2017-12-14 Thread Stephan Ewen (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-8256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16290789#comment-16290789 ]

Stephan Ewen commented on FLINK-8256:
-

Please see this issue for how we plan to improve the behavior for cases where users accidentally package the Scala library: FLINK-8264
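
Until then, a quick way to check whether the Scala library ended up in a job jar is to list the jar contents and look for Scala classes (the jar path below is just a placeholder for your own artifact):

{code:bash}
# If this prints entries such as scala/Function1.class, the Scala library
# was bundled into the user code jar and will clash with the copy that
# ships with Flink.
jar tf target/your-job.jar | grep 'scala/Function1'
{code}

On the build side, {{mvn dependency:tree}} shows whether {{scala-library}} is pulled in with {{compile}} scope.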


[jira] [Commented] (FLINK-8256) Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException

2017-12-14 Thread Stephan Ewen (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-8256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16290780#comment-16290780 ]

Stephan Ewen commented on FLINK-8256:
-

[~till.rohrmann] and I managed to reproduce this by including the Scala library in the user code jar. That duplicates the Scala classes, leading to the exception.

Please check that you build a correct jar, i.e. one that assumes the core Flink dependencies are {{provided}}. If you use the Flink quickstarts, you get this by calling {{mvn clean package -Pbuild-jar}}.
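
For projects that do not use the quickstart profile, a minimal sketch of the idea is to declare the Flink and Scala dependencies with {{provided}} scope in the pom.xml (artifact names and versions here are illustrative and should match your own setup):

{code:xml}
<!-- Sketch only: mark Flink (and the Scala library) as provided so they
     are not packaged into the user code jar. -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-streaming-scala_2.11</artifactId>
  <version>1.4.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.11.11</version>
  <scope>provided</scope>
</dependency>
{code}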


[jira] [Commented] (FLINK-8256) Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException

2017-12-14 Thread Till Rohrmann (JIRA)

[ https://issues.apache.org/jira/browse/FLINK-8256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16290720#comment-16290720 ]

Till Rohrmann commented on FLINK-8256:
--

Thanks for reporting the issue, [~Brideau]. I tried to reproduce it by taking the example program and submitting it to a vanilla Flink 1.4.0 cluster (the Hadoop-free version downloaded from flink.apache.org, the Hadoop 2.7 version downloaded from flink.apache.org, and a version built from sources). Unfortunately, I was not able to reproduce the problem, so I suspect it has something to do with either your setup or mine.

Could you maybe share all the logs with us?

How did you build the job? Did you use the Flink quickstarts?
