[jira] [Updated] (FLINK-9562) Wrong wording in flink-optimizer module
[ https://issues.apache.org/jira/browse/FLINK-9562?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Alexandr Arkhipov updated FLINK-9562:
-------------------------------------
    Description: 
There is some wrong wording in the flink-optimizer module, especially in the following files:
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\DagConnection.java - DagConnection() function - in exception string
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\TwoInputNode.java - getAlternativePlans() function - in comments
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\SortPartitionNode.java - computeLocalProperties() function - in comments
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\costs\Costs.java - compareTo() function - in comments

  was:
There is some wrong wording in the flink-optimizer module, especially in the following files:
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\DagConnection.java - DagConnection() function - in exception string
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\TwoInputNode.java - getAlternativePlans() function - in comments
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\SortPartitionNode.java - computeLocalProperties() function - in comments
#

> Wrong wording in flink-optimizer module
> ---------------------------------------
>
>                 Key: FLINK-9562
>                 URL: https://issues.apache.org/jira/browse/FLINK-9562
>             Project: Flink
>          Issue Type: Bug
>          Components: Optimizer
>    Affects Versions: 1.6.0
>         Environment: Flink-1.6-SNAPSHOT
>            Reporter: Alexandr Arkhipov
>            Assignee: Alexandr Arkhipov
>            Priority: Trivial
>              Labels: starter
>
> There is some wrong wording in the flink-optimizer module, especially in the following files:
> # flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\DagConnection.java - DagConnection() function - in exception string
> # flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\TwoInputNode.java - getAlternativePlans() function - in comments
> # flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\SortPartitionNode.java - computeLocalProperties() function - in comments
> # flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\costs\Costs.java - compareTo() function - in comments

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
[jira] [Updated] (FLINK-9562) Wrong wording in flink-optimizer module
[ https://issues.apache.org/jira/browse/FLINK-9562?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Alexandr Arkhipov updated FLINK-9562:
-------------------------------------
    Description: 
There is some wrong wording in the flink-optimizer module, especially in the following files:
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\DagConnection.java - DagConnection() function - in exception string
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\TwoInputNode.java - getAlternativePlans() function - in comments
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\SortPartitionNode.java - computeLocalProperties() function - in comments
#

  was:
There is some wrong wording in the flink-optimizer module, especially in the following files:
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\DagConnection.java - DagConnection() function - in exception string
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\TwoInputNode.java - getAlternativePlans() function - in comments

> Wrong wording in flink-optimizer module
> ---------------------------------------
>
>                 Key: FLINK-9562
>                 URL: https://issues.apache.org/jira/browse/FLINK-9562
>             Project: Flink
>          Issue Type: Bug
>          Components: Optimizer
>    Affects Versions: 1.6.0
>         Environment: Flink-1.6-SNAPSHOT
>            Reporter: Alexandr Arkhipov
>            Assignee: Alexandr Arkhipov
>            Priority: Trivial
>              Labels: starter
>
> There is some wrong wording in the flink-optimizer module, especially in the following files:
> # flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\DagConnection.java - DagConnection() function - in exception string
> # flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\TwoInputNode.java - getAlternativePlans() function - in comments
> # flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\SortPartitionNode.java - computeLocalProperties() function - in comments
> #
[jira] [Created] (FLINK-9562) Wrong wording in flink-optimizer module
Alexandr Arkhipov created FLINK-9562:
----------------------------------------

             Summary: Wrong wording in flink-optimizer module
                 Key: FLINK-9562
                 URL: https://issues.apache.org/jira/browse/FLINK-9562
             Project: Flink
          Issue Type: Bug
          Components: Optimizer
    Affects Versions: 1.6.0
         Environment: Flink-1.6-SNAPSHOT
            Reporter: Alexandr Arkhipov
            Assignee: Alexandr Arkhipov


There is some wrong wording in the flink-optimizer module, especially in the following files:
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\DagConnection.java - DagConnection() function - in exception string
# flink\flink-optimizer\src\main\java\org\apache\flink\optimizer\dag\TwoInputNode.java - getAlternativePlans() function - in comments
[jira] [Commented] (FLINK-8256) Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException
[ https://issues.apache.org/jira/browse/FLINK-8256?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16506959#comment-16506959 ]

Alexandr Arkhipov commented on FLINK-8256:
------------------------------------------

Folks, I think you can close this and reduce the number of open issues in the Flink JIRA :)

> Cannot use Scala functions to filter in 1.4 - java.lang.ClassCastException
> --------------------------------------------------------------------------
>
>                 Key: FLINK-8256
>                 URL: https://issues.apache.org/jira/browse/FLINK-8256
>             Project: Flink
>          Issue Type: Bug
>          Components: DataStream API
>    Affects Versions: 1.4.0
>         Environment: macOS, Local Flink v1.4.0, Scala 2.11
>            Reporter: Ryan Brideau
>            Priority: Major
>
> I built the newest release locally today, but when I try to filter a stream using an anonymous or named function, I get an error. Here's a simple example:
> {code:java}
> import org.apache.flink.api.java.utils.ParameterTool
> import org.apache.flink.streaming.api.scala._
>
> object TestFunction {
>   def main(args: Array[String]): Unit = {
>     val env = StreamExecutionEnvironment.getExecutionEnvironment
>     val params = ParameterTool.fromArgs(args)
>     env.getConfig.setGlobalJobParameters(params)
>     val someArray = Array(1, 2, 3)
>     val stream = env.fromCollection(someArray).filter(_ => true)
>     stream.print().setParallelism(1)
>     env.execute("Testing Function")
>   }
> }
> {code}
> This results in:
> {code:java}
> Job execution switched to status FAILING.
> org.apache.flink.streaming.runtime.tasks.StreamTaskException: Cannot instantiate user function.
>     at org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperator(StreamConfig.java:235)
>     at org.apache.flink.streaming.runtime.tasks.OperatorChain.createChainedOperator(OperatorChain.java:355)
>     at org.apache.flink.streaming.runtime.tasks.OperatorChain.createOutputCollector(OperatorChain.java:282)
>     at org.apache.flink.streaming.runtime.tasks.OperatorChain.<init>(OperatorChain.java:126)
>     at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:231)
>     at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.ClassCastException: cannot assign instance of org.peopleinmotion.TestFunction$$anonfun$1 to field org.apache.flink.streaming.api.scala.DataStream$$anon$7.cleanFun$6 of type scala.Function1 in instance of org.apache.flink.streaming.api.scala.DataStream$$anon$7
>     at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2233)
>     at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1405)
>     at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2288)
>     at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2206)
>     at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2064)
>     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568)
>     at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2282)
>     at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2206)
>     at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2064)
>     at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1568)
>     at java.io.ObjectInputStream.readObject(ObjectInputStream.java:428)
>     at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:290)
>     at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:248)
>     at org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperator(StreamConfig.java:220)
>     ... 6 more
> 12/13/2017 15:10:01 Job execution switched to status FAILED.
>
> The program finished with the following exception:
> org.apache.flink.client.program.ProgramInvocationException: The program execution failed: Job execution failed.
>     at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:492)
>     at org.apache.flink.client.program.StandaloneClusterClient.submitJob(StandaloneClusterClient.java:105)
>     at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:456)
>     at org.apache.flink.streaming.api.environment.StreamContextEnvironment.execute(StreamContextEnvironment.java:66)
>     at org.apache.flink.streaming.api.scala.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.scala:638)
>     at org.peopleinmotion.TestFunction$.main(TestFunction.scala:20)
>     at org.peopleinmotion.TestFunction.main(TestFunction.scala)
>     at
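The ClassCastException above is thrown while the task side Java-deserializes the user function. As a minimal sketch of that mechanism only (plain JDK serialization, not Flink's actual StreamConfig/InstantiationUtil path; `ClosureRoundTrip` and `roundTrip` are illustrative names), a Scala anonymous function survives a serialize/deserialize round trip in-process, where both sides load the same classes — the bug report shows what happens when they do not:

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

object ClosureRoundTrip {
  // Serialize and deserialize with plain JDK object serialization, the same
  // mechanism Flink uses to ship user functions to the task managers.
  def roundTrip[T](value: T): T = {
    val buffer = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buffer)
    out.writeObject(value)
    out.close()
    val in = new ObjectInputStream(new ByteArrayInputStream(buffer.toByteArray))
    in.readObject().asInstanceOf[T]
  }

  def main(args: Array[String]): Unit = {
    // Scala anonymous functions compile to serializable classes; the reported
    // failure surfaced when the deserialized instance could not be assigned
    // back to the expected scala.Function1 field on the receiving side.
    val predicate: Int => Boolean = _ > 1
    val restored = roundTrip(predicate)
    println(restored(2))
  }
}
```

In-process the round trip succeeds; the mismatch in the report only appears when the deserializing side resolves the anonymous class differently (e.g. a different classloader or binary).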
[jira] [Comment Edited] (FLINK-6977) Add MD5/SHA1/SHA2 supported in TableAPI
[ https://issues.apache.org/jira/browse/FLINK-6977?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16506944#comment-16506944 ]

Alexandr Arkhipov edited comment on FLINK-6977 at 6/9/18 12:10 PM:
-------------------------------------------------------------------

I have investigated the code and it seems that this has already been implemented under FLINK-6926. Please close this ticket as already fixed. Thank you.

was (Author: alex.arkhipov):
I have investigated the code and it seems that this has already been implemented under https://issues.apache.org/jira/browse/FLINK-6926. Please close this ticket as already fixed. Thank you.

> Add MD5/SHA1/SHA2 supported in TableAPI
> ---------------------------------------
>
>                 Key: FLINK-6977
>                 URL: https://issues.apache.org/jira/browse/FLINK-6977
>             Project: Flink
>          Issue Type: Sub-task
>          Components: Table API & SQL
>    Affects Versions: 1.4.0
>            Reporter: sunjincheng
>            Assignee: Alexandr Arkhipov
>            Priority: Major
>              Labels: starter
>
> See FLINK-6926 for detail.
[jira] [Commented] (FLINK-6977) Add MD5/SHA1/SHA2 supported in TableAPI
[ https://issues.apache.org/jira/browse/FLINK-6977?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16506944#comment-16506944 ]

Alexandr Arkhipov commented on FLINK-6977:
------------------------------------------

I have investigated the code and it seems that this has already been implemented under https://issues.apache.org/jira/browse/FLINK-6926. Please close this ticket as already fixed. Thank you.

> Add MD5/SHA1/SHA2 supported in TableAPI
> ---------------------------------------
>
>                 Key: FLINK-6977
>                 URL: https://issues.apache.org/jira/browse/FLINK-6977
>             Project: Flink
>          Issue Type: Sub-task
>          Components: Table API & SQL
>    Affects Versions: 1.4.0
>            Reporter: sunjincheng
>            Assignee: Alexandr Arkhipov
>            Priority: Major
>              Labels: starter
>
> See FLINK-6926 for detail.
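For context, the MD5/SHA1/SHA2 functions this ticket asks for compute standard hex digests of their input. A minimal sketch with the JDK's `java.security.MessageDigest` (`Digests` and `hexDigest` are illustrative names, not the Table API's actual implementation):

```scala
import java.security.MessageDigest

object Digests {
  // Hex-encode the digest of a UTF-8 string. The JDK algorithm names "MD5",
  // "SHA-1" and "SHA-256" correspond to the MD5/SHA1/SHA2 Table API functions.
  def hexDigest(algorithm: String, input: String): String =
    MessageDigest.getInstance(algorithm)
      .digest(input.getBytes("UTF-8"))
      .map(b => f"$b%02x")
      .mkString
}
```

For example, `Digests.hexDigest("MD5", "")` yields the well-known empty-string digest `d41d8cd98f00b204e9800998ecf8427e`.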
[jira] [Commented] (FLINK-5550) NotFoundException: Could not find job with id
[ https://issues.apache.org/jira/browse/FLINK-5550?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16494118#comment-16494118 ]

Alexandr Arkhipov commented on FLINK-5550:
------------------------------------------

Would you mind if I take this issue and work on it?

> NotFoundException: Could not find job with id
> ---------------------------------------------
>
>                 Key: FLINK-5550
>                 URL: https://issues.apache.org/jira/browse/FLINK-5550
>             Project: Flink
>          Issue Type: Bug
>          Components: Webfrontend
>    Affects Versions: 1.0.3
>         Environment: centos
>            Reporter: jiwengang
>            Priority: Minor
>              Labels: newbie
>
> Job is canceled, but still report the following exception:
> 2017-01-18 10:35:18,677 WARN  org.apache.flink.runtime.webmonitor.RuntimeMonitorHandler - Error while handling request
> org.apache.flink.runtime.webmonitor.NotFoundException: Could not find job with id 3b98e734c868cc2b992743cfe8911ad0
>     at org.apache.flink.runtime.webmonitor.handlers.AbstractExecutionGraphRequestHandler.handleRequest(AbstractExecutionGraphRequestHandler.java:58)
>     at org.apache.flink.runtime.webmonitor.RuntimeMonitorHandler.respondAsLeader(RuntimeMonitorHandler.java:135)
>     at org.apache.flink.runtime.webmonitor.RuntimeMonitorHandler.channelRead0(RuntimeMonitorHandler.java:112)
>     at org.apache.flink.runtime.webmonitor.RuntimeMonitorHandler.channelRead0(RuntimeMonitorHandler.java:60)
>     at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>     at io.netty.handler.codec.http.router.Handler.routed(Handler.java:62)
>     at io.netty.handler.codec.http.router.DualAbstractHandler.channelRead0(DualAbstractHandler.java:57)
>     at io.netty.handler.codec.http.router.DualAbstractHandler.channelRead0(DualAbstractHandler.java:20)
>     at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>     at org.apache.flink.runtime.webmonitor.HttpRequestHandler.channelRead0(HttpRequestHandler.java:104)
>     at org.apache.flink.runtime.webmonitor.HttpRequestHandler.channelRead0(HttpRequestHandler.java:65)
>     at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>     at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242)
>     at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:147)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
>     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
>     at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847)
>     at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
>     at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
>     at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
>     at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
>     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
>     at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
>     at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
>     at java.lang.Thread.run(Thread.java:745)
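One way to read the report above: a canceled job is no longer in the web monitor's execution-graph view, so the lookup should surface as a quiet 404 response rather than a WARN-level stack trace. A minimal sketch of that lookup shape (`JobCache`, `findJob`, `statusCode` and the job id `"a1b2c3"` are hypothetical, not Flink's real web-monitor API):

```scala
object JobCache {
  // Hypothetical in-memory view: only currently tracked jobs are present; a
  // canceled job such as 3b98e734c868cc2b992743cfe8911ad0 has been evicted.
  private val trackedJobs = Map("a1b2c3" -> "RUNNING")

  // Returning Option instead of throwing lets the HTTP layer distinguish
  // "missing job" from a real internal error.
  def findJob(jobId: String): Option[String] = trackedJobs.get(jobId)

  // Map a missing job to 404 instead of logging a NotFoundException with a
  // full stack trace at WARN level.
  def statusCode(jobId: String): Int =
    if (findJob(jobId).isDefined) 200 else 404
}
```

With this shape, requesting an evicted job id simply yields 404, and the log stays clean.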
[jira] [Commented] (FLINK-6977) Add MD5/SHA1/SHA2 supported in TableAPI
[ https://issues.apache.org/jira/browse/FLINK-6977?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16494097#comment-16494097 ]

Alexandr Arkhipov commented on FLINK-6977:
------------------------------------------

Is there someone working on this? If not, would you mind if I take this and fix it?

> Add MD5/SHA1/SHA2 supported in TableAPI
> ---------------------------------------
>
>                 Key: FLINK-6977
>                 URL: https://issues.apache.org/jira/browse/FLINK-6977
>             Project: Flink
>          Issue Type: Sub-task
>          Components: Table API & SQL
>    Affects Versions: 1.4.0
>            Reporter: sunjincheng
>            Priority: Major
>              Labels: starter
>
> See FLINK-6926 for detail.
[jira] [Commented] (FLINK-7259) Match not exhaustive in TaskMonitor
[ https://issues.apache.org/jira/browse/FLINK-7259?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16492633#comment-16492633 ]

Alexandr Arkhipov commented on FLINK-7259:
------------------------------------------

This one has already been resolved by FLINK-8931.

> Match not exhaustive in TaskMonitor
> -----------------------------------
>
>                 Key: FLINK-7259
>                 URL: https://issues.apache.org/jira/browse/FLINK-7259
>             Project: Flink
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 1.3.1, 1.4.0
>            Reporter: Till Rohrmann
>            Priority: Major
>
> Two matches are not exhaustive in the class {{TaskMonitor}}. This can lead to a {{MatchError}}, potentially restarting this actor (depending on the supervision strategy).
> {code}
> /Users/uce/Code/flink/flink-mesos/src/main/scala/org/apache/flink/mesos/scheduler/TaskMonitor.scala:157: warning: match may not be exhaustive.
> [WARNING] It would fail on the following input: TASK_KILLING
> [WARNING] msg.status().getState match {
> [WARNING]                       ^
> [WARNING] /Users/uce/Code/flink/flink-mesos/src/main/scala/org/apache/flink/mesos/scheduler/TaskMonitor.scala:170: warning: match may not be exhaustive.
> [WARNING] It would fail on the following input: TASK_KILLING
> [WARNING] msg.status().getState match {
> {code}
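For reference, the warning quoted above flags matches that would throw scala.MatchError at runtime when given an unhandled state such as TASK_KILLING. A minimal sketch with a hypothetical sealed state type (illustrative names, not the actual Mesos TaskState protobuf enum): sealed hierarchies let the compiler check exhaustiveness, and adding the missing case silences the warning.

```scala
// A sealed trait means all subtypes are known at compile time, so the
// compiler can warn when a pattern match does not cover every case.
sealed trait TaskState
case object TaskRunning  extends TaskState
case object TaskKilling  extends TaskState
case object TaskFinished extends TaskState

object TaskStates {
  // Omitting the TaskKilling case here would reproduce the reported
  // "match may not be exhaustive" warning and risk a scala.MatchError
  // at runtime; listing it (or a wildcard fallback) fixes both.
  def describe(state: TaskState): String = state match {
    case TaskRunning  => "running"
    case TaskKilling  => "killing"
    case TaskFinished => "finished"
  }
}
```

The fix FLINK-8931 landed on is the same idea: handle the previously missing state explicitly rather than letting the match fall through.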