[jira] [Commented] (FLINK-3741) Travis Compile Error: MissingRequirementError: object scala.runtime in compiler mirror not found.

2016-06-10 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3741?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15325447#comment-15325447
 ] 

Todd Lisonbee commented on FLINK-3741:
--

Makes sense to close.  It did seem like a transient issue.  It can always be 
re-opened later, if needed.

Closing this issue.  Thanks.

> Travis Compile Error: MissingRequirementError: object scala.runtime in 
> compiler mirror not found.
> -
>
> Key: FLINK-3741
> URL: https://issues.apache.org/jira/browse/FLINK-3741
> Project: Flink
>  Issue Type: Bug
>Reporter: Todd Lisonbee
>  Labels: CI, build
>
> Build failed on one of my pull requests and on at least 3 others from other 
> people.
> The problem seems to be in the latest master as of 4/11/2016 (my pull request 
> only added a Javadoc comment).
> OpenJDK 7, hadoop.profile=1 and other profiles
> https://s3.amazonaws.com/archive.travis-ci.org/jobs/122460456/log.txt
> https://s3.amazonaws.com/archive.travis-ci.org/jobs/122381837/log.txt
> https://s3.amazonaws.com/archive.travis-ci.org/jobs/122589293/log.txt
> https://s3.amazonaws.com/archive.travis-ci.org/jobs/122589296/log.txt
> Error:
> [INFO] 
> /home/travis/build/apache/flink/flink-libraries/flink-ml/src/main/scala:-1: 
> info: compiling
> [INFO] Compiling 43 source files to 
> /home/travis/build/apache/flink/flink-libraries/flink-ml/target/classes at 
> 1460421450188
> [ERROR] error: error while loading , error in opening zip file
> [ERROR] error: scala.reflect.internal.MissingRequirementError: object 
> scala.runtime in compiler mirror not found.
> [ERROR]   at 
> scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
> [ERROR]   at 
> scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
> [INFO]at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
> [INFO]at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
> [INFO]at scala.tools.nsc.Main$.doCompile(Main.scala:79)
> [INFO]at scala.tools.nsc.Driver.process(Driver.scala:54)
> [INFO]at scala.tools.nsc.Driver.main(Driver.scala:67)
> [INFO]at scala.tools.nsc.Main.main(Main.scala)
> [INFO]at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> [INFO]at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> [INFO]at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> [INFO]at java.lang.reflect.Method.invoke(Method.java:606)
> [INFO]at 
> org_scala_tools_maven_executions.MainHelper.runMain(MainHelper.java:161)
> [INFO]at 
> org_scala_tools_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
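The "error while loading , error in opening zip file" line in the quoted log usually points to a truncated or corrupted jar in the build's dependency cache (a partially downloaded artifact, for instance). A minimal clean-up sketch, assuming a POSIX shell and that corrupt downloads show up as zero-byte jars; `purge_empty_jars` and the scratch directory are illustrative, not part of the Flink build:

```shell
# Remove zero-byte .jar files under a repository directory; truncated
# downloads often surface later as "error in opening zip file" from scalac.
purge_empty_jars() {
  find "$1" -name '*.jar' -size 0 -print -delete
}

# Demonstration against a scratch directory standing in for ~/.m2/repository:
tmp=$(mktemp -d)
: > "$tmp/broken.jar"        # zero-byte jar: should be deleted
printf 'x' > "$tmp/ok.jar"   # non-empty jar: should survive
purge_empty_jars "$tmp"
ls "$tmp"                    # only ok.jar remains
```

Against a real cache the argument would be ~/.m2/repository, after which the next mvn run re-downloads the removed artifacts.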



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Closed] (FLINK-3741) Travis Compile Error: MissingRequirementError: object scala.runtime in compiler mirror not found.

2016-06-10 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3741?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee closed FLINK-3741.

Resolution: Cannot Reproduce






[jira] [Commented] (FLINK-3741) Travis Compile Error: MissingRequirementError: object scala.runtime in compiler mirror not found.

2016-05-23 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3741?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15296549#comment-15296549
 ] 

Todd Lisonbee commented on FLINK-3741:
--

This issue has possibly been fixed in the Scala compiler:
https://issues.scala-lang.org/browse/SI-5463

Fix version: Scala 2.12.0-M5
https://github.com/scala/scala/pull/5153






[jira] [Closed] (FLINK-3744) LocalFlinkMiniClusterITCase times out occasionally when building locally

2016-04-14 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee closed FLINK-3744.

Resolution: Cannot Reproduce

> LocalFlinkMiniClusterITCase times out occasionally when building locally
> 
>
> Key: FLINK-3744
> URL: https://issues.apache.org/jira/browse/FLINK-3744
> Project: Flink
>  Issue Type: Bug
>Reporter: Todd Lisonbee
>Priority: Trivial
>  Labels: build, flaky-test
>
> When building locally 
> LocalFlinkMiniClusterITCase.testLocalFlinkMiniClusterWithMultipleTaskManagers 
> timed out.
> Out of many local builds this has only happened to me once.  This test 
> immediately passed when I ran `mvn verify` a second time.
> Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 23.139 sec 
> <<< FAILURE! - in 
> org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase
> testLocalFlinkMiniClusterWithMultipleTaskManagers(org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase)
>   Time elapsed: 23.087 sec  <<< ERROR!
> java.util.concurrent.TimeoutException: Futures timed out after [1 
> milliseconds]
>   at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>   at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
>   at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:86)
>   at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:86)
>   at 
> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>   at scala.concurrent.Await$.ready(package.scala:86)
>   at 
> org.apache.flink.runtime.minicluster.FlinkMiniCluster.waitForTaskManagersToBeRegistered(FlinkMiniCluster.scala:455)
>   at 
> org.apache.flink.runtime.minicluster.FlinkMiniCluster.waitForTaskManagersToBeRegistered(FlinkMiniCluster.scala:439)
>   at 
> org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:330)
>   at 
> org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:269)
>   at 
> org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase.testLocalFlinkMiniClusterWithMultipleTaskManagers(LocalFlinkMiniClusterITCase.java:73)





[jira] [Commented] (FLINK-3744) LocalFlinkMiniClusterITCase times out occasionally when building locally

2016-04-14 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3744?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15241537#comment-15241537
 ] 

Todd Lisonbee commented on FLINK-3744:
--

I didn't capture more of the logs when it happened.

I just tried running this test in a loop well over 200 times with no repeat of 
the error.

Closing this issue (someone can re-open if they see it again).
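Running a single test in a loop, as described above, can be scripted; `run_n_times` is a hypothetical helper, and the `mvn` selector in the comment is an assumed example, not a verified surefire invocation:

```shell
# Run a command up to N times, stopping at the first failure, to shake out
# an intermittent test. Example (assumed selector, not verified):
#   run_n_times "mvn -q verify -Dtest=LocalFlinkMiniClusterITCase" 200
run_n_times() {
  cmd=$1; n=$2; i=1
  while [ "$i" -le "$n" ]; do
    $cmd || { echo "failed on iteration $i"; return 1; }
    i=$((i + 1))
  done
  echo "no failures in $n runs"
}

run_n_times true 3   # trivial stand-in command for demonstration
```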






[jira] [Updated] (FLINK-3745) TimestampITCase testWatermarkPropagationNoFinalWatermarkOnStop failing intermittently

2016-04-12 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3745?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3745:
-
Description: 
Test failed randomly in Travis,
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122624297/log.txt

{noformat}
java.lang.Exception: Stopping the job with ID ef892dfdf31b74a9a3da991d2240716e 
failed.
at 
org.apache.flink.runtime.minicluster.LocalFlinkMiniCluster.stopJob(LocalFlinkMiniCluster.scala:283)
at 
org.apache.flink.streaming.timestamp.TimestampITCase$1.run(TimestampITCase.java:213)
Caused by: java.lang.IllegalStateException: Job with ID 
ef892dfdf31b74a9a3da991d2240716e is in state FAILING but stopping is only 
allowed in state RUNNING.
at 
org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1.applyOrElse(JobManager.scala:577)
at 
scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
at 
org.apache.flink.runtime.testingUtils.TestingJobManagerLike$$anonfun$handleTestingMessage$1.applyOrElse(TestingJobManagerLike.scala:90)
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167)
at 
org.apache.flink.runtime.LeaderSessionMessageFilter$$anonfun$receive$1.applyOrElse(LeaderSessionMessageFilter.scala:36)
at 
scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
at 
org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:33)
at 
org.apache.flink.runtime.LogMessages$$anon$1.apply(LogMessages.scala:28)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
at 
org.apache.flink.runtime.LogMessages$$anon$1.applyOrElse(LogMessages.scala:28)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at 
org.apache.flink.runtime.jobmanager.JobManager.aroundReceive(JobManager.scala:113)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
at akka.dispatch.Mailbox.run(Mailbox.scala:221)
at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at 
scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at 
scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at 
scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Tests run: 12, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 15.087 sec <<< 
FAILURE! - in org.apache.flink.streaming.timestamp.TimestampITCase
testWatermarkPropagationNoFinalWatermarkOnStop(org.apache.flink.streaming.timestamp.TimestampITCase)
  Time elapsed: 0.792 sec  <<< ERROR!
org.apache.flink.client.program.ProgramInvocationException: The program 
execution failed: Job execution failed.
at org.apache.flink.client.program.Client.runBlocking(Client.java:381)
at org.apache.flink.client.program.Client.runBlocking(Client.java:355)
at org.apache.flink.client.program.Client.runBlocking(Client.java:348)
at 
org.apache.flink.streaming.api.environment.RemoteStreamEnvironment.executeRemotely(RemoteStreamEnvironment.java:206)
at 
org.apache.flink.streaming.api.environment.RemoteStreamEnvironment.execute(RemoteStreamEnvironment.java:172)
at 
org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1170)
at 
org.apache.flink.streaming.timestamp.TimestampITCase.testWatermarkPropagationNoFinalWatermarkOnStop(TimestampITCase.java:223)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution 
failed.
at 
org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply$mcV$sp(JobManager.scala:805)
at 
org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:751)
at 
org.apache.flink.runtime.jobmanager.JobManager$$anonfun$handleMessage$1$$anonfun$applyOrElse$7.apply(JobManager.scala:751)
at 
scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at 
scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:41)
at 
akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:401)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at 
scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
at 
scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
at 
scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at 

[jira] [Created] (FLINK-3746) WebRuntimeMonitorITCase.testNoCopyFromJar failing intermittently

2016-04-12 Thread Todd Lisonbee (JIRA)
Todd Lisonbee created FLINK-3746:


 Summary: WebRuntimeMonitorITCase.testNoCopyFromJar failing 
intermittently
 Key: FLINK-3746
 URL: https://issues.apache.org/jira/browse/FLINK-3746
 Project: Flink
  Issue Type: Bug
Reporter: Todd Lisonbee
Priority: Minor



Test failed randomly in Travis,
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122624299/log.txt

Tests run: 5, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 13.127 sec <<< 
FAILURE! - in org.apache.flink.runtime.webmonitor.WebRuntimeMonitorITCase
testNoCopyFromJar(org.apache.flink.runtime.webmonitor.WebRuntimeMonitorITCase)  
Time elapsed: 0.124 sec  <<< FAILURE!
java.lang.AssertionError: expected:<200 OK> but was:<503 Service Unavailable>
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:743)
at org.junit.Assert.assertEquals(Assert.java:118)
at org.junit.Assert.assertEquals(Assert.java:144)
at 
org.apache.flink.runtime.webmonitor.WebRuntimeMonitorITCase.testNoCopyFromJar(WebRuntimeMonitorITCase.java:456)


Results :

Failed tests: 
  WebRuntimeMonitorITCase.testNoCopyFromJar:456 expected:<200 OK> but was:<503 
Service Unavailable>





[jira] [Created] (FLINK-3745) TimestampITCase testWatermarkPropagationNoFinalWatermarkOnStop failing intermittently

2016-04-12 Thread Todd Lisonbee (JIRA)
Todd Lisonbee created FLINK-3745:


 Summary: TimestampITCase 
testWatermarkPropagationNoFinalWatermarkOnStop failing intermittently
 Key: FLINK-3745
 URL: https://issues.apache.org/jira/browse/FLINK-3745
 Project: Flink
  Issue Type: Bug
Reporter: Todd Lisonbee
Priority: Minor


https://s3.amazonaws.com/archive.travis-ci.org/jobs/122624297/log.txt


[jira] [Updated] (FLINK-3741) Travis Compile Error: MissingRequirementError: object scala.runtime in compiler mirror not found.

2016-04-12 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3741?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3741:
-
Description: 
Build failed on one of my pull requests and at least 3 others from other people.

Seems like problem is in latest master as of 4/11/2016 (my pull request only 
had a Javadoc comment added).

OpenJDK 7, hadoop.profile=1 and other profiles

https://s3.amazonaws.com/archive.travis-ci.org/jobs/122460456/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122381837/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122589293/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122589296/log.txt

Error:
[INFO] 
/home/travis/build/apache/flink/flink-libraries/flink-ml/src/main/scala:-1: 
info: compiling
[INFO] Compiling 43 source files to 
/home/travis/build/apache/flink/flink-libraries/flink-ml/target/classes at 
1460421450188
[ERROR] error: error while loading , error in opening zip file
[ERROR] error: scala.reflect.internal.MissingRequirementError: object 
scala.runtime in compiler mirror not found.
[ERROR] at 
scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[ERROR] at 
scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
[INFO]  at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
[INFO]  at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
[INFO]  at scala.tools.nsc.Main$.doCompile(Main.scala:79)
[INFO]  at scala.tools.nsc.Driver.process(Driver.scala:54)
[INFO]  at scala.tools.nsc.Driver.main(Driver.scala:67)
[INFO]  at scala.tools.nsc.Main.main(Main.scala)
[INFO]  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[INFO]  at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
[INFO]  at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[INFO]  at java.lang.reflect.Method.invoke(Method.java:606)
[INFO]  at 
org_scala_tools_maven_executions.MainHelper.runMain(MainHelper.java:161)
[INFO]  at 
org_scala_tools_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
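The pairing of "error in opening zip file" with the missing scala.runtime object usually points to a corrupted jar on the compiler's classpath, often a partially downloaded artifact in the local Maven repository. As a diagnostic sketch (the root cause here is an assumption, and the `JarScan` class and default repository path are illustrative, not part of the Flink build), each cached jar can be probed by trying to open it as a zip archive:

```java
// Diagnostic sketch: probe every jar under a Maven local repository and
// report those that cannot be opened as zip archives. A corrupted cached
// jar is one plausible cause of scalac's "error in opening zip file".
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;
import java.util.zip.ZipFile;

public class JarScan {
    /** Returns true when the jar opens as a readable zip archive. */
    static boolean isReadableJar(File jar) {
        try (ZipFile zf = new ZipFile(jar)) {
            return zf.size() >= 0;  // touching the central directory
        } catch (IOException e) {
            return false;  // corrupt or truncated archive
        }
    }

    public static void main(String[] args) throws IOException {
        // Assumed default repository location; override with a CLI argument.
        Path root = Paths.get(args.length > 0 ? args[0]
                : System.getProperty("user.home") + "/.m2/repository");
        try (Stream<Path> paths = Files.walk(root)) {
            paths.filter(p -> p.toString().endsWith(".jar"))
                 .filter(p -> !isReadableJar(p.toFile()))
                 .forEach(p -> System.out.println("corrupt: " + p));
        }
    }
}
```

Deleting any reported artifacts and re-running the build would force Maven to re-download them.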

  was:
Build failed on one of my pull requests and at least 3 others from other people.

Seems like problem is in latest master as of 4/11/2016 (my pull request only 
had a Javadoc comment added).

OpenJDK 7, hadoop.profile=1

https://s3.amazonaws.com/archive.travis-ci.org/jobs/122460456/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122381837/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122589293/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122589296/log.txt

Error:
[INFO] 
/home/travis/build/apache/flink/flink-libraries/flink-ml/src/main/scala:-1: 
info: compiling
[INFO] Compiling 43 source files to 
/home/travis/build/apache/flink/flink-libraries/flink-ml/target/classes at 
1460421450188
[ERROR] error: error while loading , error in opening zip file
[ERROR] error: scala.reflect.internal.MissingRequirementError: object 
scala.runtime in compiler mirror not found.
[ERROR] at 
scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[ERROR] at 

[jira] [Updated] (FLINK-3741) Travis Compile Error: MissingRequirementError: object scala.runtime in compiler mirror not found.

2016-04-12 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3741?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3741:
-
Description: 
Build failed on one of my pull requests and at least 3 others from other people.

Seems like problem is in latest master as of 4/11/2016 (my pull request only 
had a Javadoc comment added).

OpenJDK 7, hadoop.profile=1

https://s3.amazonaws.com/archive.travis-ci.org/jobs/122460456/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122381837/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122589293/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122589296/log.txt

Error:
[INFO] 
/home/travis/build/apache/flink/flink-libraries/flink-ml/src/main/scala:-1: 
info: compiling
[INFO] Compiling 43 source files to 
/home/travis/build/apache/flink/flink-libraries/flink-ml/target/classes at 
1460421450188
[ERROR] error: error while loading , error in opening zip file
[ERROR] error: scala.reflect.internal.MissingRequirementError: object 
scala.runtime in compiler mirror not found.
[ERROR] at 
scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[ERROR] at 
scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
[INFO]  at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
[INFO]  at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
[INFO]  at scala.tools.nsc.Main$.doCompile(Main.scala:79)
[INFO]  at scala.tools.nsc.Driver.process(Driver.scala:54)
[INFO]  at scala.tools.nsc.Driver.main(Driver.scala:67)
[INFO]  at scala.tools.nsc.Main.main(Main.scala)
[INFO]  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[INFO]  at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
[INFO]  at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[INFO]  at java.lang.reflect.Method.invoke(Method.java:606)
[INFO]  at 
org_scala_tools_maven_executions.MainHelper.runMain(MainHelper.java:161)
[INFO]  at 
org_scala_tools_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)

  was:
Build failed on one of my pull requests and one from someone else.

Seems like problem is in latest master as of 4/11/2016 (my pull request only 
had a Javadoc comment added).

OpenJDK 7, hadoop.profile=1

https://s3.amazonaws.com/archive.travis-ci.org/jobs/122460456/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122381837/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122589293/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122589296/log.txt

Error:
[INFO] 
/home/travis/build/apache/flink/flink-libraries/flink-ml/src/main/scala:-1: 
info: compiling
[INFO] Compiling 43 source files to 
/home/travis/build/apache/flink/flink-libraries/flink-ml/target/classes at 
1460421450188
[ERROR] error: error while loading , error in opening zip file
[ERROR] error: scala.reflect.internal.MissingRequirementError: object 
scala.runtime in compiler mirror not found.
[ERROR] at 
scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[ERROR] at 
scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
[INFO] 

[jira] [Updated] (FLINK-3741) Travis Compile Error: MissingRequirementError: object scala.runtime in compiler mirror not found.

2016-04-12 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3741?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3741:
-
Description: 
Build failed on one of my pull requests and one from someone else.

Seems like problem is in latest master as of 4/11/2016 (my pull request only 
had a Javadoc comment added).

OpenJDK 7, hadoop.profile=1

https://s3.amazonaws.com/archive.travis-ci.org/jobs/122460456/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122381837/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122589293/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122589296/log.txt

Error:
[INFO] 
/home/travis/build/apache/flink/flink-libraries/flink-ml/src/main/scala:-1: 
info: compiling
[INFO] Compiling 43 source files to 
/home/travis/build/apache/flink/flink-libraries/flink-ml/target/classes at 
1460421450188
[ERROR] error: error while loading , error in opening zip file
[ERROR] error: scala.reflect.internal.MissingRequirementError: object 
scala.runtime in compiler mirror not found.
[ERROR] at 
scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[ERROR] at 
scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
[INFO]  at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
[INFO]  at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
[INFO]  at scala.tools.nsc.Main$.doCompile(Main.scala:79)
[INFO]  at scala.tools.nsc.Driver.process(Driver.scala:54)
[INFO]  at scala.tools.nsc.Driver.main(Driver.scala:67)
[INFO]  at scala.tools.nsc.Main.main(Main.scala)
[INFO]  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[INFO]  at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
[INFO]  at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[INFO]  at java.lang.reflect.Method.invoke(Method.java:606)
[INFO]  at 
org_scala_tools_maven_executions.MainHelper.runMain(MainHelper.java:161)
[INFO]  at 
org_scala_tools_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)

  was:
Build failed on one of my pull requests and one from someone else.

Seems like problem is in latest master as of 4/11/2016 (my pull request only 
had a Javadoc comment added).

OpenJDK 7, hadoop.profile=1

https://s3.amazonaws.com/archive.travis-ci.org/jobs/122460456/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122381837/log.txt

Error:
[INFO] 
/home/travis/build/apache/flink/flink-libraries/flink-ml/src/main/scala:-1: 
info: compiling
[INFO] Compiling 43 source files to 
/home/travis/build/apache/flink/flink-libraries/flink-ml/target/classes at 
1460421450188
[ERROR] error: error while loading , error in opening zip file
[ERROR] error: scala.reflect.internal.MissingRequirementError: object 
scala.runtime in compiler mirror not found.
[ERROR] at 
scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[ERROR] at 
scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
[INFO]  at 

[jira] [Updated] (FLINK-3744) LocalFlinkMiniClusterITCase times out occasionally when building locally

2016-04-12 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3744:
-
Labels: build flaky-test  (was: flaky-test)

> LocalFlinkMiniClusterITCase times out occasionally when building locally
> 
>
> Key: FLINK-3744
> URL: https://issues.apache.org/jira/browse/FLINK-3744
> Project: Flink
>  Issue Type: Bug
>Reporter: Todd Lisonbee
>Priority: Trivial
>  Labels: build, flaky-test
>
> When building locally 
> LocalFlinkMiniClusterITCase.testLocalFlinkMiniClusterWithMultipleTaskManagers 
> timed out.
> Out of many local builds this has only happened to me once.  This test 
> immediately passed when I ran `mvn verify` a second time.
> Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 23.139 sec 
> <<< FAILURE! - in 
> org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase
> testLocalFlinkMiniClusterWithMultipleTaskManagers(org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase)
>   Time elapsed: 23.087 sec  <<< ERROR!
> java.util.concurrent.TimeoutException: Futures timed out after [1 
> milliseconds]
>   at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>   at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
>   at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:86)
>   at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:86)
>   at 
> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>   at scala.concurrent.Await$.ready(package.scala:86)
>   at 
> org.apache.flink.runtime.minicluster.FlinkMiniCluster.waitForTaskManagersToBeRegistered(FlinkMiniCluster.scala:455)
>   at 
> org.apache.flink.runtime.minicluster.FlinkMiniCluster.waitForTaskManagersToBeRegistered(FlinkMiniCluster.scala:439)
>   at 
> org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:330)
>   at 
> org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:269)
>   at 
> org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase.testLocalFlinkMiniClusterWithMultipleTaskManagers(LocalFlinkMiniClusterITCase.java:73)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Created] (FLINK-3744) LocalFlinkMiniClusterITCase times out occasionally when building locally

2016-04-12 Thread Todd Lisonbee (JIRA)
Todd Lisonbee created FLINK-3744:


 Summary: LocalFlinkMiniClusterITCase times out occasionally when 
building locally
 Key: FLINK-3744
 URL: https://issues.apache.org/jira/browse/FLINK-3744
 Project: Flink
  Issue Type: Bug
Reporter: Todd Lisonbee
Priority: Trivial


When building locally 
LocalFlinkMiniClusterITCase.testLocalFlinkMiniClusterWithMultipleTaskManagers 
timed out.

Out of many local builds this has only happened to me once.

Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 23.139 sec <<< 
FAILURE! - in 
org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase
testLocalFlinkMiniClusterWithMultipleTaskManagers(org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase)
  Time elapsed: 23.087 sec  <<< ERROR!
java.util.concurrent.TimeoutException: Futures timed out after [1 
milliseconds]
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:86)
at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:86)
at 
scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
at scala.concurrent.Await$.ready(package.scala:86)
at 
org.apache.flink.runtime.minicluster.FlinkMiniCluster.waitForTaskManagersToBeRegistered(FlinkMiniCluster.scala:455)
at 
org.apache.flink.runtime.minicluster.FlinkMiniCluster.waitForTaskManagersToBeRegistered(FlinkMiniCluster.scala:439)
at 
org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:330)
at 
org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:269)
at 
org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase.testLocalFlinkMiniClusterWithMultipleTaskManagers(LocalFlinkMiniClusterITCase.java:73)





[jira] [Updated] (FLINK-3744) LocalFlinkMiniClusterITCase times out occasionally when building locally

2016-04-12 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3744?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3744:
-
Description: 
When building locally 
LocalFlinkMiniClusterITCase.testLocalFlinkMiniClusterWithMultipleTaskManagers 
timed out.

Out of many local builds this has only happened to me once.  This test 
immediately passed when I ran `mvn verify` a second time.

Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 23.139 sec <<< 
FAILURE! - in 
org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase
testLocalFlinkMiniClusterWithMultipleTaskManagers(org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase)
  Time elapsed: 23.087 sec  <<< ERROR!
java.util.concurrent.TimeoutException: Futures timed out after [1 
milliseconds]
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:86)
at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:86)
at 
scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
at scala.concurrent.Await$.ready(package.scala:86)
at 
org.apache.flink.runtime.minicluster.FlinkMiniCluster.waitForTaskManagersToBeRegistered(FlinkMiniCluster.scala:455)
at 
org.apache.flink.runtime.minicluster.FlinkMiniCluster.waitForTaskManagersToBeRegistered(FlinkMiniCluster.scala:439)
at 
org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:330)
at 
org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:269)
at 
org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase.testLocalFlinkMiniClusterWithMultipleTaskManagers(LocalFlinkMiniClusterITCase.java:73)

  was:
When building locally 
LocalFlinkMiniClusterITCase.testLocalFlinkMiniClusterWithMultipleTaskManagers 
timed out.

Out of many local builds this has only happened to me once.

Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 23.139 sec <<< 
FAILURE! - in 
org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase
testLocalFlinkMiniClusterWithMultipleTaskManagers(org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase)
  Time elapsed: 23.087 sec  <<< ERROR!
java.util.concurrent.TimeoutException: Futures timed out after [1 
milliseconds]
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:86)
at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:86)
at 
scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
at scala.concurrent.Await$.ready(package.scala:86)
at 
org.apache.flink.runtime.minicluster.FlinkMiniCluster.waitForTaskManagersToBeRegistered(FlinkMiniCluster.scala:455)
at 
org.apache.flink.runtime.minicluster.FlinkMiniCluster.waitForTaskManagersToBeRegistered(FlinkMiniCluster.scala:439)
at 
org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:330)
at 
org.apache.flink.runtime.minicluster.FlinkMiniCluster.start(FlinkMiniCluster.scala:269)
at 
org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase.testLocalFlinkMiniClusterWithMultipleTaskManagers(LocalFlinkMiniClusterITCase.java:73)


> LocalFlinkMiniClusterITCase times out occasionally when building locally
> 
>
> Key: FLINK-3744
> URL: https://issues.apache.org/jira/browse/FLINK-3744
> Project: Flink
>  Issue Type: Bug
>Reporter: Todd Lisonbee
>Priority: Trivial
>  Labels: flaky-test
>
> When building locally 
> LocalFlinkMiniClusterITCase.testLocalFlinkMiniClusterWithMultipleTaskManagers 
> timed out.
> Out of many local builds this has only happened to me once.  This test 
> immediately passed when I ran `mvn verify` a second time.
> Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 23.139 sec 
> <<< FAILURE! - in 
> org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase
> testLocalFlinkMiniClusterWithMultipleTaskManagers(org.apache.flink.test.runtime.minicluster.LocalFlinkMiniClusterITCase)
>   Time elapsed: 23.087 sec  <<< ERROR!
> java.util.concurrent.TimeoutException: Futures timed out after [1 
> milliseconds]
>   at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>   at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
>   at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:86)
>   at 

[jira] [Comment Edited] (FLINK-3741) Travis Compile Error: MissingRequirementError: object scala.runtime in compiler mirror not found.

2016-04-12 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3741?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15238012#comment-15238012
 ] 

Todd Lisonbee edited comment on FLINK-3741 at 4/12/16 9:28 PM:
---

Breeze upgrade seems easy.

I've submitted a pull request under FLINK-3743 
https://github.com/apache/flink/pull/1876

(I did it under another ticket because (1) I'm not positive it will fix this 
intermittent issue, and (2) it doesn't hurt to upgrade to the latest breeze 
library anyway.)
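For reference, the version bump amounts to a one-line dependency change. The fragment below is a hypothetical sketch of what that change looks like in flink-ml's pom.xml; the `scala.binary.version` property and exact layout are assumed conventions, not the actual pull request:

```xml
<!-- Hypothetical sketch of the bump in flink-libraries/flink-ml/pom.xml. -->
<dependency>
  <groupId>org.scalanlp</groupId>
  <artifactId>breeze_${scala.binary.version}</artifactId>
  <version>0.12</version>  <!-- previously 0.11.2 -->
</dependency>
```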


was (Author: tlisonbee):
Breeze upgrade seems easy.

I've submitted a pull request under FLINK-3743 
https://github.com/apache/flink/pull/1876

> Travis Compile Error: MissingRequirementError: object scala.runtime in 
> compiler mirror not found.
> -
>
> Key: FLINK-3741
> URL: https://issues.apache.org/jira/browse/FLINK-3741
> Project: Flink
>  Issue Type: Bug
>Reporter: Todd Lisonbee
>  Labels: CI, build
>
> Build failed on one of my pull requests and one from someone else.
> Seems like problem is in latest master as of 4/11/2016 (my pull request only 
> had a Javadoc comment added).
> OpenJDK 7, hadoop.profile=1
> https://s3.amazonaws.com/archive.travis-ci.org/jobs/122460456/log.txt
> https://s3.amazonaws.com/archive.travis-ci.org/jobs/122381837/log.txt
> Error:
> [INFO] 
> /home/travis/build/apache/flink/flink-libraries/flink-ml/src/main/scala:-1: 
> info: compiling
> [INFO] Compiling 43 source files to 
> /home/travis/build/apache/flink/flink-libraries/flink-ml/target/classes at 
> 1460421450188
> [ERROR] error: error while loading , error in opening zip file
> [ERROR] error: scala.reflect.internal.MissingRequirementError: object 
> scala.runtime in compiler mirror not found.
> [ERROR]   at 
> scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
> [ERROR]   at 
> scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
> [INFO]at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
> [INFO]at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
> [INFO]at scala.tools.nsc.Main$.doCompile(Main.scala:79)
> [INFO]at scala.tools.nsc.Driver.process(Driver.scala:54)
> [INFO]at scala.tools.nsc.Driver.main(Driver.scala:67)
> [INFO]at scala.tools.nsc.Main.main(Main.scala)
> [INFO]at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> [INFO]at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> [INFO]at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> [INFO]at java.lang.reflect.Method.invoke(Method.java:606)
> [INFO]at 
> org_scala_tools_maven_executions.MainHelper.runMain(MainHelper.java:161)
> [INFO]at 
> org_scala_tools_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)




[jira] [Commented] (FLINK-3741) Travis Compile Error: MissingRequirementError: object scala.runtime in compiler mirror not found.

2016-04-12 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3741?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15238012#comment-15238012
 ] 

Todd Lisonbee commented on FLINK-3741:
--

Breeze upgrade seems easy.

I've submitted a pull request under FLINK-3743 
https://github.com/apache/flink/pull/1876

> Travis Compile Error: MissingRequirementError: object scala.runtime in 
> compiler mirror not found.
> -
>
> Key: FLINK-3741
> URL: https://issues.apache.org/jira/browse/FLINK-3741
> Project: Flink
>  Issue Type: Bug
>Reporter: Todd Lisonbee
>  Labels: CI, build
>
> Build failed on one of my pull requests and one from someone else.
> Seems like problem is in latest master as of 4/11/2016 (my pull request only 
> had a Javadoc comment added).
> OpenJDK 7, hadoop.profile=1
> https://s3.amazonaws.com/archive.travis-ci.org/jobs/122460456/log.txt
> https://s3.amazonaws.com/archive.travis-ci.org/jobs/122381837/log.txt
> Error:
> [INFO] 
> /home/travis/build/apache/flink/flink-libraries/flink-ml/src/main/scala:-1: 
> info: compiling
> [INFO] Compiling 43 source files to 
> /home/travis/build/apache/flink/flink-libraries/flink-ml/target/classes at 
> 1460421450188
> [ERROR] error: error while loading , error in opening zip file
> [ERROR] error: scala.reflect.internal.MissingRequirementError: object 
> scala.runtime in compiler mirror not found.
> [ERROR]   at 
> scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
> [ERROR]   at 
> scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
> [INFO]at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
> [INFO]at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
> [INFO]at scala.tools.nsc.Main$.doCompile(Main.scala:79)
> [INFO]at scala.tools.nsc.Driver.process(Driver.scala:54)
> [INFO]at scala.tools.nsc.Driver.main(Driver.scala:67)
> [INFO]at scala.tools.nsc.Main.main(Main.scala)
> [INFO]at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> [INFO]at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> [INFO]at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> [INFO]at java.lang.reflect.Method.invoke(Method.java:606)
> [INFO]at 
> org_scala_tools_maven_executions.MainHelper.runMain(MainHelper.java:161)
> [INFO]at 
> org_scala_tools_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)





[jira] [Updated] (FLINK-3743) Upgrade breeze from 0.11.2 to 0.12

2016-04-12 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3743?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3743:
-
Summary: Upgrade breeze from 0.11.2 to 0.12  (was: Upgrate breeze from 
0.11.2 to 0.12)

> Upgrade breeze from 0.11.2 to 0.12
> --
>
> Key: FLINK-3743
> URL: https://issues.apache.org/jira/browse/FLINK-3743
> Project: Flink
>  Issue Type: Task
>Reporter: Todd Lisonbee
>Priority: Minor
>
> Upgrade to a new version of breeze that is available.





[jira] [Commented] (FLINK-3743) Upgrate breeze from 0.11.2 to 0.12

2016-04-12 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3743?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15238003#comment-15238003
 ] 

Todd Lisonbee commented on FLINK-3743:
--

I don't know of any particular reason to do this except I believe it may be a 
workaround to this issue: FLINK-3741

> Upgrate breeze from 0.11.2 to 0.12
> --
>
> Key: FLINK-3743
> URL: https://issues.apache.org/jira/browse/FLINK-3743
> Project: Flink
>  Issue Type: Task
>Reporter: Todd Lisonbee
>Priority: Minor
>
> Upgrade to the new version of breeze that is available.





[jira] [Commented] (FLINK-3737) WikipediaEditsSourceTest.testWikipediaEditsSource() fails locally

2016-04-12 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3737?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15237897#comment-15237897
 ] 

Todd Lisonbee commented on FLINK-3737:
--

Yes, that makes sense.  

Maybe with an Assume, 
http://junit.org/junit4/javadoc/latest/org/junit/Assume.html

I can implement that quickly.
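
For reference, a minimal sketch of the Assume approach (the host, port, and 
timeout here are placeholder assumptions, not what the eventual pull request 
uses): probe connectivity first, and skip the test rather than fail it when the 
IRC server is unreachable.

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class ConnectivityGuard {

    // Returns true only if a TCP connection to host:port succeeds within timeoutMs.
    public static boolean canReach(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    // Inside the JUnit test it could be used like this (endpoint values are
    // placeholders for whatever the test actually connects to):
    //
    // @Test
    // public void testWikipediaEditsSource() throws Exception {
    //     Assume.assumeTrue("IRC server unreachable, skipping test",
    //             canReach("irc.wikimedia.org", 6667, 5000));
    //     ...
    // }
}
```

With Assume, an unreachable server marks the test as skipped instead of 
producing the "Connection timed out" failure above.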

> WikipediaEditsSourceTest.testWikipediaEditsSource() fails locally
> -
>
> Key: FLINK-3737
> URL: https://issues.apache.org/jira/browse/FLINK-3737
> Project: Flink
>  Issue Type: Bug
>  Components: Tests
>Reporter: Todd Lisonbee
>Priority: Minor
>  Labels: test-stability
>
> This test fails for me locally from both Maven command line and IntelliJ.
> This is on latest master as of 4/11/2016, apache-maven-3.1.1, openjdk version 
> "1.8.0_51"
> (It might be because I am behind a proxy).
> Error message:
> java.lang.NullPointerException
>   at org.schwering.irc.lib.IRCConnection.send(IRCConnection.java:394)
>   at 
> org.apache.flink.streaming.connectors.wikiedits.WikipediaEditEventIrcStream.leave(WikipediaEditEventIrcStream.java:77)
>   at 
> org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSource.close(WikipediaEditsSource.java:84)
>   at 
> org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:45)
>   at 
> org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.dispose(AbstractUdfStreamOperator.java:107)
>   at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.disposeAllOperators(StreamTask.java:347)
>   at 
> org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:294)
>   at org.apache.flink.runtime.taskmanager.Task.run(Task.java:579)
>   at java.lang.Thread.run(Thread.java:745)
> Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 64.749 sec 
> <<< FAILURE! - in 
> org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSourceTest
> testWikipediaEditsSource(org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSourceTest)
>   Time elapsed: 64.744 sec  <<< FAILURE!
> org.junit.ComparisonFailure: expected:<[Expected test exception]> but 
> was:<[Connection timed out]>
>   at org.junit.Assert.assertEquals(Assert.java:115)
>   at org.junit.Assert.assertEquals(Assert.java:144)
>   at 
> org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSourceTest.testWikipediaEditsSource(WikipediaEditsSourceTest.java:53)





[jira] [Commented] (FLINK-3741) Travis Compile Error: MissingRequirementError: object scala.runtime in compiler mirror not found.

2016-04-12 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3741?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15237772#comment-15237772
 ] 

Todd Lisonbee commented on FLINK-3741:
--

I researched this some...

It sounds like Scala will give this error message if any jar file on the 
classpath is corrupt.  There is a related ticket for the Scala language, 
https://issues.scala-lang.org/browse/SI-5463

I didn't find any nice way to detect corrupt jars in Maven; people either 
delete their .m2 repo or run a command like:

find . -name "*jar" | xargs -L 1 zip -T | grep error | grep invalid

http://robbypelssers.blogspot.com/2010/07/finding-corrupt-jars-in-maven.html

A reasonable guess is that this comes from a corrupt breeze library.  

Reasoning:
- Any corrupt jar might cause this error message
- The error is happening while compiling the flink-ml source code
- breeze is the only extra library in this module that isn't in test scope or 
included somewhere else

If there is a corrupt breeze library cached somewhere, the easy workaround 
might be to upgrade breeze to the current version.  I'll try it quickly to see 
if that is an easy change.
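
The same check can be sketched in plain Java, assuming only that a corrupt jar 
fails to open as a zip archive (the repository path is whatever local Maven 
cache is in use; the class name is illustrative):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Stream;
import java.util.zip.ZipFile;

public class CorruptJarFinder {

    // Returns the jars from the given list that cannot be opened as zip archives.
    public static List<Path> findCorrupt(List<Path> jars) {
        List<Path> corrupt = new ArrayList<>();
        for (Path jar : jars) {
            // ZipFile reads the central directory on open, so a truncated or
            // garbled jar fails here with "error in opening zip file".
            try (ZipFile zip = new ZipFile(jar.toFile())) {
                zip.size();
            } catch (IOException e) {
                corrupt.add(jar);
            }
        }
        return corrupt;
    }

    public static void main(String[] args) throws IOException {
        Path root = Paths.get(args.length > 0
                ? args[0]
                : System.getProperty("user.home") + "/.m2/repository");
        List<Path> jars = new ArrayList<>();
        try (Stream<Path> files = Files.walk(root)) {
            files.filter(p -> p.toString().endsWith(".jar")).forEach(jars::add);
        }
        for (Path jar : findCorrupt(jars)) {
            System.out.println("corrupt: " + jar);
        }
    }
}
```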

> Travis Compile Error: MissingRequirementError: object scala.runtime in 
> compiler mirror not found.
> -
>
> Key: FLINK-3741
> URL: https://issues.apache.org/jira/browse/FLINK-3741
> Project: Flink
>  Issue Type: Bug
>Reporter: Todd Lisonbee
>  Labels: CI, build
>
> Build failed on one of my pull requests and one from someone else.
> Seems like problem is in latest master as of 4/11/2016 (my pull request only 
> had a Javadoc comment added).
> OpenJDK 7, hadoop.profile=1
> https://s3.amazonaws.com/archive.travis-ci.org/jobs/122460456/log.txt
> https://s3.amazonaws.com/archive.travis-ci.org/jobs/122381837/log.txt
> Error:
> [INFO] 
> /home/travis/build/apache/flink/flink-libraries/flink-ml/src/main/scala:-1: 
> info: compiling
> [INFO] Compiling 43 source files to 
> /home/travis/build/apache/flink/flink-libraries/flink-ml/target/classes at 
> 1460421450188
> [ERROR] error: error while loading , error in opening zip file
> [ERROR] error: scala.reflect.internal.MissingRequirementError: object 
> scala.runtime in compiler mirror not found.
> [ERROR]   at 
> scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
> [ERROR]   at 
> scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
> [INFO]at 
> scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
> [INFO]at 
> scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
> [INFO]at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
> [INFO]at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
> [INFO]at scala.tools.nsc.Main$.doCompile(Main.scala:79)
> [INFO]at scala.tools.nsc.Driver.process(Driver.scala:54)
> [INFO]at scala.tools.nsc.Driver.main(Driver.scala:67)
> [INFO]at scala.tools.nsc.Main.main(Main.scala)
> [INFO]at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> [INFO]at 
> 

[jira] [Commented] (FLINK-3741) Travis Compile Error: MissingRequirementError: object scala.runtime in compiler mirror not found.

2016-04-12 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3741?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15237612#comment-15237612
 ] 

Todd Lisonbee commented on FLINK-3741:
--

Does sound like a broken artifact in Maven,
http://stackoverflow.com/questions/7600028/maven-error-in-opening-zip-file-when-running-maven






[jira] [Updated] (FLINK-3741) Travis Compile Error: MissingRequirementError: object scala.runtime in compiler mirror not found.

2016-04-12 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3741?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3741:
-
Priority: Major  (was: Critical)






[jira] [Updated] (FLINK-3741) Travis Compile Error: MissingRequirementError: object scala.runtime in compiler mirror not found.

2016-04-12 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3741?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3741:
-
Labels: CI build  (was: )






[jira] [Created] (FLINK-3741) Travis Compile Error: MissingRequirementError: object scala.runtime in compiler mirror not found.

2016-04-12 Thread Todd Lisonbee (JIRA)
Todd Lisonbee created FLINK-3741:


 Summary: Travis Compile Error: MissingRequirementError: object 
scala.runtime in compiler mirror not found.
 Key: FLINK-3741
 URL: https://issues.apache.org/jira/browse/FLINK-3741
 Project: Flink
  Issue Type: Bug
Reporter: Todd Lisonbee
Priority: Critical


Build failed on one of my pull requests and one from someone else.

Seems like problem is in latest master as of 4/11/2016 (my pull request only 
had a Javadoc comment added).

OpenJDK 7, hadoop.profile=1

https://s3.amazonaws.com/archive.travis-ci.org/jobs/122460456/log.txt
https://s3.amazonaws.com/archive.travis-ci.org/jobs/122381837/log.txt

Error:
[INFO] 
/home/travis/build/apache/flink/flink-libraries/flink-ml/src/main/scala:-1: 
info: compiling
[INFO] Compiling 43 source files to 
/home/travis/build/apache/flink/flink-libraries/flink-ml/target/classes at 
1460421450188
[ERROR] error: error while loading , error in opening zip file
[ERROR] error: scala.reflect.internal.MissingRequirementError: object 
scala.runtime in compiler mirror not found.
[ERROR] at 
scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[ERROR] at 
scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
[INFO]  at 
scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:184)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1024)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1023)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1153)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
[INFO]  at 
scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
[INFO]  at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
[INFO]  at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
[INFO]  at scala.tools.nsc.Main$.doCompile(Main.scala:79)
[INFO]  at scala.tools.nsc.Driver.process(Driver.scala:54)
[INFO]  at scala.tools.nsc.Driver.main(Driver.scala:67)
[INFO]  at scala.tools.nsc.Main.main(Main.scala)
[INFO]  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[INFO]  at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
[INFO]  at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[INFO]  at java.lang.reflect.Method.invoke(Method.java:606)
[INFO]  at 
org_scala_tools_maven_executions.MainHelper.runMain(MainHelper.java:161)
[INFO]  at 
org_scala_tools_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)





[jira] [Commented] (FLINK-3737) WikipediaEditsSourceTest.testWikipediaEditsSource() fails locally

2016-04-11 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3737?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15236140#comment-15236140
 ] 

Todd Lisonbee commented on FLINK-3737:
--

Submitted two pull requests, one to the website and one to the source code,
https://github.com/apache/flink-web/pull/18
https://github.com/apache/flink/pull/1872

Basically this test needs SOCKS proxy setup,
http://docs.oracle.com/javase/8/docs/technotes/guides/net/proxies.html
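
The JVM reads the SOCKS proxy configuration from system properties, so under 
that guide the setup is along these lines (host and port are placeholders for a 
site-specific proxy, not values from the pull requests):

```java
public class ProxySetup {
    public static void main(String[] args) {
        // Standard JVM SOCKS proxy properties; the values are placeholders.
        System.setProperty("socksProxyHost", "proxy.example.com");
        System.setProperty("socksProxyPort", "1080");

        // Equivalent on the command line, e.g. when running the tests:
        //   mvn test -DsocksProxyHost=proxy.example.com -DsocksProxyPort=1080
        System.out.println("SOCKS proxy: "
                + System.getProperty("socksProxyHost") + ":"
                + System.getProperty("socksProxyPort"));
    }
}
```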






[jira] [Commented] (FLINK-3737) WikipediaEditsSourceTest.testWikipediaEditsSource() fails locally

2016-04-11 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3737?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15236093#comment-15236093
 ] 

Todd Lisonbee commented on FLINK-3737:
--

Yes, this was a proxy issue.  I'll document how to fix it.






[jira] [Updated] (FLINK-3737) WikipediaEditsSourceTest.testWikipediaEditsSource() fails locally

2016-04-11 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3737?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3737:
-
Description: 
This test fails for me locally from both Maven command line and IntelliJ.

This is on latest master as of 4/11/2016, apache-maven-3.1.1, openjdk version 
"1.8.0_51"

(It might be because I am behind a proxy).

Error message:

java.lang.NullPointerException
at org.schwering.irc.lib.IRCConnection.send(IRCConnection.java:394)
at 
org.apache.flink.streaming.connectors.wikiedits.WikipediaEditEventIrcStream.leave(WikipediaEditEventIrcStream.java:77)
at 
org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSource.close(WikipediaEditsSource.java:84)
at 
org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:45)
at 
org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.dispose(AbstractUdfStreamOperator.java:107)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.disposeAllOperators(StreamTask.java:347)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:294)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:579)
at java.lang.Thread.run(Thread.java:745)
Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 64.749 sec <<< 
FAILURE! - in 
org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSourceTest
testWikipediaEditsSource(org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSourceTest)
  Time elapsed: 64.744 sec  <<< FAILURE!
org.junit.ComparisonFailure: expected:<[Expected test exception]> but 
was:<[Connection timed out]>
at org.junit.Assert.assertEquals(Assert.java:115)
at org.junit.Assert.assertEquals(Assert.java:144)
at 
org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSourceTest.testWikipediaEditsSource(WikipediaEditsSourceTest.java:53)


  was:
This test fails for me locally.  It might be because I am behind a proxy.

This is on latest master as of 4/11/2016, apache-maven-3.1.1, openjdk version 
"1.8.0_51"

Error message:

java.lang.NullPointerException
at org.schwering.irc.lib.IRCConnection.send(IRCConnection.java:394)
at 
org.apache.flink.streaming.connectors.wikiedits.WikipediaEditEventIrcStream.leave(WikipediaEditEventIrcStream.java:77)
at 
org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSource.close(WikipediaEditsSource.java:84)
at 
org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:45)
at 
org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.dispose(AbstractUdfStreamOperator.java:107)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.disposeAllOperators(StreamTask.java:347)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:294)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:579)
at java.lang.Thread.run(Thread.java:745)
Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 64.749 sec <<< 
FAILURE! - in 
org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSourceTest
testWikipediaEditsSource(org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSourceTest)
  Time elapsed: 64.744 sec  <<< FAILURE!
org.junit.ComparisonFailure: expected:<[Expected test exception]> but 
was:<[Connection timed out]>
at org.junit.Assert.assertEquals(Assert.java:115)
at org.junit.Assert.assertEquals(Assert.java:144)
at 
org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSourceTest.testWikipediaEditsSource(WikipediaEditsSourceTest.java:53)



> WikipediaEditsSourceTest.testWikipediaEditsSource() fails locally
> -
>
> Key: FLINK-3737
> URL: https://issues.apache.org/jira/browse/FLINK-3737
> Project: Flink
>  Issue Type: Bug
>  Components: Tests
>Reporter: Todd Lisonbee
>Priority: Minor
>  Labels: test-stability
>
> This test fails for me locally from both Maven command line and IntelliJ.
> This is on latest master as of 4/11/2016, apache-maven-3.1.1, openjdk version 
> "1.8.0_51"
> (It might be because I am behind a proxy).
> Error message:
> java.lang.NullPointerException
>   at org.schwering.irc.lib.IRCConnection.send(IRCConnection.java:394)
>   at 
> org.apache.flink.streaming.connectors.wikiedits.WikipediaEditEventIrcStream.leave(WikipediaEditEventIrcStream.java:77)
>   at 
> org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSource.close(WikipediaEditsSource.java:84)
>   at 
> org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:45)
>   at 
> 

[jira] [Created] (FLINK-3737) WikipediaEditsSourceTest.testWikipediaEditsSource() fails locally

2016-04-11 Thread Todd Lisonbee (JIRA)
Todd Lisonbee created FLINK-3737:


 Summary: WikipediaEditsSourceTest.testWikipediaEditsSource() fails 
locally
 Key: FLINK-3737
 URL: https://issues.apache.org/jira/browse/FLINK-3737
 Project: Flink
  Issue Type: Bug
  Components: Tests
Reporter: Todd Lisonbee
Priority: Minor


This test fails for me locally.  It might be because I am behind a proxy.

This is on latest master as of 4/11/2016, apache-maven-3.1.1, openjdk version 
"1.8.0_51"

Error message:

java.lang.NullPointerException
at org.schwering.irc.lib.IRCConnection.send(IRCConnection.java:394)
at 
org.apache.flink.streaming.connectors.wikiedits.WikipediaEditEventIrcStream.leave(WikipediaEditEventIrcStream.java:77)
at 
org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSource.close(WikipediaEditsSource.java:84)
at 
org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:45)
at 
org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.dispose(AbstractUdfStreamOperator.java:107)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.disposeAllOperators(StreamTask.java:347)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:294)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:579)
at java.lang.Thread.run(Thread.java:745)
Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 64.749 sec <<< 
FAILURE! - in 
org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSourceTest
testWikipediaEditsSource(org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSourceTest)
  Time elapsed: 64.744 sec  <<< FAILURE!
org.junit.ComparisonFailure: expected:<[Expected test exception]> but 
was:<[Connection timed out]>
at org.junit.Assert.assertEquals(Assert.java:115)
at org.junit.Assert.assertEquals(Assert.java:144)
at 
org.apache.flink.streaming.connectors.wikiedits.WikipediaEditsSourceTest.testWikipediaEditsSource(WikipediaEditsSourceTest.java:53)




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)


[jira] [Commented] (FLINK-3530) Kafka09ITCase.testBigRecordJob fails on Travis

2016-04-11 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3530?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15235602#comment-15235602
 ] 

Todd Lisonbee commented on FLINK-3530:
--

This test failed for me too but with a different error,
https://s3.amazonaws.com/archive.travis-ci.org/jobs/121312995/log.txt

My error:
Tests run: 15, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 100.796 sec 
<<< FAILURE! - in org.apache.flink.streaming.connectors.kafka.Kafka09ITCase
testBigRecordJob(org.apache.flink.streaming.connectors.kafka.Kafka09ITCase)  
Time elapsed: 4.769 sec  <<< FAILURE!
java.lang.AssertionError: Test failed: The program execution failed: Job 
execution failed.
at org.junit.Assert.fail(Assert.java:88)
at org.apache.flink.test.util.TestUtils.tryExecute(TestUtils.java:41)
at 
org.apache.flink.streaming.connectors.kafka.KafkaConsumerTestBase.runBigRecordTestTopology(KafkaConsumerTestBase.java:927)
at 
org.apache.flink.streaming.connectors.kafka.Kafka09ITCase.testBigRecordJob(Kafka09ITCase.java:96)

Other error:
Tests run: 15, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 262.129 sec 
<<< FAILURE! - in org.apache.flink.streaming.connectors.kafka.Kafka09ITCase
testBigRecordJob(org.apache.flink.streaming.connectors.kafka.Kafka09ITCase)  
Time elapsed: 60.169 sec  <<< ERROR!
java.lang.Exception: test timed out after 60000 milliseconds
at java.lang.Object.wait(Native Method)
at java.lang.Object.wait(Object.java:503)
at org.apache.zookeeper.ClientCnxn.submitRequest(ClientCnxn.java:1342)
at org.apache.zookeeper.ZooKeeper.create(ZooKeeper.java:781)
at org.I0Itec.zkclient.ZkConnection.create(ZkConnection.java:99)
at org.I0Itec.zkclient.ZkClient$3.call(ZkClient.java:529)
at org.I0Itec.zkclient.ZkClient$3.call(ZkClient.java:526)
at org.I0Itec.zkclient.ZkClient.retryUntilConnected(ZkClient.java:985)
at org.I0Itec.zkclient.ZkClient.create(ZkClient.java:526)
at org.I0Itec.zkclient.ZkClient.createPersistent(ZkClient.java:403)
at kafka.utils.ZkPath$.createPersistent(ZkUtils.scala:911)
at kafka.utils.ZkUtils.createPersistentPath(ZkUtils.scala:391)
at kafka.admin.AdminUtils$.deleteTopic(AdminUtils.scala:165)
at kafka.admin.AdminUtils.deleteTopic(AdminUtils.scala)
at 
org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironmentImpl.deleteTestTopic(KafkaTestEnvironmentImpl.java:274)
at 
org.apache.flink.streaming.connectors.kafka.KafkaTestBase.deleteTestTopic(KafkaTestBase.java:166)
at 
org.apache.flink.streaming.connectors.kafka.KafkaConsumerTestBase.runBigRecordTestTopology(KafkaConsumerTestBase.java:877)
at 
org.apache.flink.streaming.connectors.kafka.Kafka09ITCase.testBigRecordJob(Kafka09ITCase.java:96)

> Kafka09ITCase.testBigRecordJob fails on Travis
> --
>
> Key: FLINK-3530
> URL: https://issues.apache.org/jira/browse/FLINK-3530
> Project: Flink
>  Issue Type: Bug
>  Components: Kafka Connector
>Affects Versions: 1.0.0
>Reporter: Till Rohrmann
>  Labels: test-stability
>
> The test case {{Kafka09ITCase.testBigRecordJob}} failed on Travis.
> https://s3.amazonaws.com/archive.travis-ci.org/jobs/112049279/log.txt





[jira] [Commented] (FLINK-3716) Kafka08ITCase.testFailOnNoBroker() timing out before it has a chance to pass

2016-04-08 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3716?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15232384#comment-15232384
 ] 

Todd Lisonbee commented on FLINK-3716:
--

I opened a pull request with a fix.

By decreasing the socket timeout, the test passes for me locally and also runs a 
lot faster. The alternative would be to increase the JUnit timeout, but that 
seemed less desirable. I'm still not sure why this test passes on some systems 
but not mine; I suspect the passing systems have extra configuration. In any 
case, the fix seems valid.

https://github.com/apache/flink/pull/1864
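For context, the fix works by shrinking the consumer's socket timeout so that each failed metadata attempt returns quickly instead of blocking for the full default. A minimal sketch, assuming the standard Kafka 0.8 `socket.timeout.ms` setting; the helper class and chosen values below are illustrative, not the PR's actual diff:

```java
// Sketch only (not the actual change in the PR above): the Kafka 0.8 consumer
// reads its socket timeout from the "socket.timeout.ms" property (default
// 30000 ms), so a test that deliberately points at a dead broker can fail
// fast by lowering it. The helper class and values are illustrative.
import java.util.Properties;

public class FastFailConsumerProps {
    static Properties lowTimeoutProps() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:80"); // no broker listening here
        props.setProperty("socket.timeout.ms", "3000");         // down from the 30000 default
        return props;
    }

    public static void main(String[] args) {
        System.out.println(lowTimeoutProps().getProperty("socket.timeout.ms"));
    }
}
```

With a 3 s socket timeout, the three metadata attempts finish in well under the 60 s JUnit budget instead of roughly a minute each.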

> Kafka08ITCase.testFailOnNoBroker() timing out before it has a chance to pass
> 
>
> Key: FLINK-3716
> URL: https://issues.apache.org/jira/browse/FLINK-3716
> Project: Flink
>  Issue Type: Bug
>Reporter: Todd Lisonbee
>  Labels: test-stability
>
> This is on the latest master 4/7/2016 with `mvn clean verify`.  
> Test also reliably fails running it directly from IntelliJ.
> Test has a 60 second timeout but it seems to need much more time to run (my 
> workstation has server class Xeon).
> Test 
> testFailOnNoBroker(org.apache.flink.streaming.connectors.kafka.Kafka08ITCase) 
> failed with:
> java.lang.Exception: test timed out after 60000 milliseconds
>   at sun.nio.ch.Net.poll(Native Method)
>   at sun.nio.ch.SocketChannelImpl.poll(SocketChannelImpl.java:954)
>   at 
> sun.nio.ch.SocketAdaptor$SocketInputStream.read(SocketAdaptor.java:204)
>   at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103)
>   at 
> java.nio.channels.Channels$ReadableByteChannelImpl.read(Channels.java:385)
>   at kafka.utils.Utils$.read(Utils.scala:380)
>   at 
> kafka.network.BoundedByteBufferReceive.readFrom(BoundedByteBufferReceive.scala:54)
>   at kafka.network.Receive$class.readCompletely(Transmission.scala:56)
>   at 
> kafka.network.BoundedByteBufferReceive.readCompletely(BoundedByteBufferReceive.scala:29)
>   at kafka.network.BlockingChannel.receive(BlockingChannel.scala:111)
>   at kafka.consumer.SimpleConsumer.liftedTree1$1(SimpleConsumer.scala:79)
>   at 
> kafka.consumer.SimpleConsumer.kafka$consumer$SimpleConsumer$$sendRequest(SimpleConsumer.scala:68)
>   at kafka.consumer.SimpleConsumer.send(SimpleConsumer.scala:91)
>   at kafka.javaapi.consumer.SimpleConsumer.send(SimpleConsumer.scala:68)
>   at 
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.getPartitionsForTopic(FlinkKafkaConsumer08.java:521)
>   at 
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:218)
>   at 
> org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironmentImpl.getConsumer(KafkaTestEnvironmentImpl.java:95)
>   at 
> org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironment.getConsumer(KafkaTestEnvironment.java:65)
>   at 
> org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironment.getConsumer(KafkaTestEnvironment.java:73)
>   at 
> org.apache.flink.streaming.connectors.kafka.KafkaConsumerTestBase.runFailOnNoBrokerTest(KafkaConsumerTestBase.java:155)
>   at 
> org.apache.flink.streaming.connectors.kafka.Kafka08ITCase.testFailOnNoBroker(Kafka08ITCase.java:54)





[jira] [Updated] (FLINK-3716) Kafka08ITCase.testFailOnNoBroker() timing out before it has a chance to pass

2016-04-08 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3716?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3716:
-
Labels: test-stability  (was: )

> Kafka08ITCase.testFailOnNoBroker() timing out before it has a chance to pass
> 
>
> Key: FLINK-3716
> URL: https://issues.apache.org/jira/browse/FLINK-3716
> Project: Flink
>  Issue Type: Bug
>Reporter: Todd Lisonbee
>  Labels: test-stability
>
> This is on the latest master 4/7/2016 with `mvn clean verify`.  
> Test also reliably fails running it directly from IntelliJ.
> Test has a 60 second timeout but it seems to need much more time to run (my 
> workstation has server class Xeon).
> Test 
> testFailOnNoBroker(org.apache.flink.streaming.connectors.kafka.Kafka08ITCase) 
> failed with:
> java.lang.Exception: test timed out after 60000 milliseconds
>   at sun.nio.ch.Net.poll(Native Method)
>   at sun.nio.ch.SocketChannelImpl.poll(SocketChannelImpl.java:954)
>   at 
> sun.nio.ch.SocketAdaptor$SocketInputStream.read(SocketAdaptor.java:204)
>   at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103)
>   at 
> java.nio.channels.Channels$ReadableByteChannelImpl.read(Channels.java:385)
>   at kafka.utils.Utils$.read(Utils.scala:380)
>   at 
> kafka.network.BoundedByteBufferReceive.readFrom(BoundedByteBufferReceive.scala:54)
>   at kafka.network.Receive$class.readCompletely(Transmission.scala:56)
>   at 
> kafka.network.BoundedByteBufferReceive.readCompletely(BoundedByteBufferReceive.scala:29)
>   at kafka.network.BlockingChannel.receive(BlockingChannel.scala:111)
>   at kafka.consumer.SimpleConsumer.liftedTree1$1(SimpleConsumer.scala:79)
>   at 
> kafka.consumer.SimpleConsumer.kafka$consumer$SimpleConsumer$$sendRequest(SimpleConsumer.scala:68)
>   at kafka.consumer.SimpleConsumer.send(SimpleConsumer.scala:91)
>   at kafka.javaapi.consumer.SimpleConsumer.send(SimpleConsumer.scala:68)
>   at 
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.getPartitionsForTopic(FlinkKafkaConsumer08.java:521)
>   at 
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:218)
>   at 
> org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironmentImpl.getConsumer(KafkaTestEnvironmentImpl.java:95)
>   at 
> org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironment.getConsumer(KafkaTestEnvironment.java:65)
>   at 
> org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironment.getConsumer(KafkaTestEnvironment.java:73)
>   at 
> org.apache.flink.streaming.connectors.kafka.KafkaConsumerTestBase.runFailOnNoBrokerTest(KafkaConsumerTestBase.java:155)
>   at 
> org.apache.flink.streaming.connectors.kafka.Kafka08ITCase.testFailOnNoBroker(Kafka08ITCase.java:54)





[jira] [Commented] (FLINK-3716) Kafka08ITCase.testFailOnNoBroker() timing out before it has a chance to pass

2016-04-07 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3716?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15231380#comment-15231380
 ] 

Todd Lisonbee commented on FLINK-3716:
--

I enabled more logging to see what was happening during the lost 3 minutes. The 
consumer retries multiple times, timing out on each attempt. That looks 
reasonable and is likely the intended behavior, but I'm not sure why it fails on 
my system and not on the build server. Either I'm missing some needed settings 
or there are Linux settings that differ on my machine.

3184 [main] INFO  org.apache.flink.streaming.connectors.kafka.Kafka08ITCase  - 

Test 
testFailOnNoBroker(org.apache.flink.streaming.connectors.kafka.Kafka08ITCase) 
is running.

3205 [Thread-14] INFO  
org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08  - Trying to 
get topic metadata from broker localhost:80 in try 0/3
33255 [Thread-14] INFO  kafka.consumer.SimpleConsumer  - Reconnect due to 
socket error: java.net.SocketTimeoutException
63290 [Thread-14] WARN  
org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08  - Error 
communicating with broker localhost:80 to find partitions for 
[doesntexist].class java.net.SocketTimeoutException. Message: null
63791 [Thread-14] INFO  
org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08  - Trying to 
get topic metadata from broker localhost:80 in try 1/3
93823 [Thread-14] INFO  kafka.consumer.SimpleConsumer  - Reconnect due to 
socket error: java.net.SocketTimeoutException
123849 [Thread-14] WARN  
org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08  - Error 
communicating with broker localhost:80 to find partitions for 
[doesntexist].class java.net.SocketTimeoutException. Message: null
124349 [Thread-14] INFO  
org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08  - Trying to 
get topic metadata from broker localhost:80 in try 2/3
154378 [Thread-14] INFO  kafka.consumer.SimpleConsumer  - Reconnect due to 
socket error: java.net.SocketTimeoutException
184382 [Thread-14] WARN  
org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08  - Error 
communicating with broker localhost:80 to find partitions for 
[doesntexist].class java.net.SocketTimeoutException. Message: null
184886 [main] INFO  org.apache.flink.streaming.connectors.kafka.Kafka08ITCase  
- 

Test 
testFailOnNoBroker(org.apache.flink.streaming.connectors.kafka.Kafka08ITCase) 
successfully run.
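The "try 0/3" through "try 2/3" pattern in the log above can be sketched as a bounded retry loop with a socket timeout on each attempt. Everything below is illustrative; the class and method names are invented, not Flink's actual retry code:

```java
// Sketch of the retry pattern visible in the log above: three bounded attempts
// ("Trying to get topic metadata ... in try 0/3" through "try 2/3"), each one
// ending in a SocketTimeoutException, with a short pause between attempts.
// Names are illustrative; this is not Flink's implementation.
public class BoundedRetry {
    interface Attempt { void run() throws Exception; }

    /** Runs the attempt up to maxTries times; returns the attempt count on
     *  success, or throws RuntimeException once all tries are exhausted. */
    static int retry(Attempt attempt, int maxTries, long pauseMillis) {
        Exception last = null;
        for (int i = 0; i < maxTries; i++) {
            try {
                attempt.run();
                return i + 1;                       // success on attempt i
            } catch (Exception e) {
                last = e;
                System.out.println("try " + i + "/" + maxTries + " failed: " + e);
                try { Thread.sleep(pauseMillis); } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                }
            }
        }
        throw new RuntimeException("gave up after " + maxTries + " tries", last);
    }

    public static void main(String[] args) {
        try {
            retry(() -> { throw new java.net.SocketTimeoutException("connect timed out"); }, 3, 10);
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

With a ~60 s timeout per attempt, three such attempts explain the roughly 3 minutes the test spends before the JUnit timeout fires.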

184888 [main] INFO  org.apache.flink.streaming.connectors.kafka.KafkaTestBase  
- -
184888 [main] INFO  org.apache.flink.streaming.connectors.kafka.KafkaTestBase  
- Shut down KafkaTestBase 
184888 [main] INFO  org.apache.flink.streaming.connectors.kafka.KafkaTestBase  
- -
184900 [flink-akka.actor.default-dispatcher-3] INFO  
org.apache.flink.runtime.testingUtils.TestingTaskManager  - Stopping 
TaskManager akka://flink/user/taskmanager0#1733825815.
184900 [flink-akka.actor.default-dispatcher-4] INFO  
org.apache.flink.runtime.testingUtils.TestingJobManager  - Stopping JobManager 
akka.tcp://flink@127.0.0.1:1424/user/jobmanager.
184901 [flink-akka.actor.default-dispatcher-3] INFO  
org.apache.flink.runtime.testingUtils.TestingTaskManager  - Disassociating from 
JobManager
184903 [flink-akka.actor.default-dispatcher-3] INFO  
org.apache.flink.runtime.blob.BlobCache  - Shutting down BlobCache

> Kafka08ITCase.testFailOnNoBroker() timing out before it has a chance to pass
> 
>
> Key: FLINK-3716
> URL: https://issues.apache.org/jira/browse/FLINK-3716
> Project: Flink
>  Issue Type: Bug
>Reporter: Todd Lisonbee
>
> This is on the latest master 4/7/2016 with `mvn clean verify`.  
> Test also reliably fails running it directly from IntelliJ.
> Test has a 60 second timeout but it seems to need much more time to run (my 
> workstation has server class Xeon).
> Test 
> testFailOnNoBroker(org.apache.flink.streaming.connectors.kafka.Kafka08ITCase) 
> failed with:
> java.lang.Exception: test timed out after 60000 milliseconds
>   at sun.nio.ch.Net.poll(Native Method)
>   at sun.nio.ch.SocketChannelImpl.poll(SocketChannelImpl.java:954)
>   at 
> sun.nio.ch.SocketAdaptor$SocketInputStream.read(SocketAdaptor.java:204)
>   at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103)
>   at 
> 

[jira] [Commented] (FLINK-3716) Kafka08ITCase.testFailOnNoBroker() timing out before it has a chance to pass

2016-04-07 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3716?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15231041#comment-15231041
 ] 

Todd Lisonbee commented on FLINK-3716:
--

Most of the time is spent at this point in the logs. The test moves along fine 
for the first 3 seconds, hangs for about 3 minutes, and then finishes in a few 
more seconds.

3200 [flink-akka.actor.default-dispatcher-3] INFO  
org.apache.flink.runtime.testingUtils.TestingTaskManager  - Memory usage stats: 
[HEAP: 127/305/711 MB, NON HEAP: 45/46/-1 MB (used/committed/max)]
3210 [flink-akka.actor.default-dispatcher-3] INFO  
org.apache.flink.runtime.testingUtils.TestingTaskManager  - Trying to register 
at JobManager akka.tcp://flink@127.0.0.1:1424/user/jobmanager (attempt 1, 
timeout: 500 milliseconds)
3237 [flink-akka.actor.default-dispatcher-2] INFO  
org.apache.flink.runtime.instance.InstanceManager  - Registered TaskManager at 
gao-wse (akka.tcp://flink@127.0.0.1:1524/user/taskmanager0) as 
9a1f5ef24c0272e4bf0d6aca0697cbb1. Current number of registered hosts is 1. 
Current number of alive task slots is 8.
3248 [flink-akka.actor.default-dispatcher-3] INFO  
org.apache.flink.runtime.testingUtils.TestingTaskManager  - Successful 
registration at JobManager (akka.tcp://flink@127.0.0.1:1424/user/jobmanager), 
starting network stack and library cache.
3262 [flink-akka.actor.default-dispatcher-3] INFO  
org.apache.flink.runtime.testingUtils.TestingTaskManager  - Determined BLOB 
server address to be /127.0.0.1:59077. Starting BLOB cache.
3263 [flink-akka.actor.default-dispatcher-3] INFO  
org.apache.flink.runtime.blob.BlobCache  - Created BLOB cache storage directory 
/tmp/blobStore-0860fd56-6531-451a-b46c-996722835a6a
3271 [main] INFO  org.apache.flink.streaming.connectors.kafka.Kafka08ITCase  - 

Test 
testFailOnNoBroker(org.apache.flink.streaming.connectors.kafka.Kafka08ITCase) 
is running.

184995 [flink-akka.actor.default-dispatcher-2] INFO  
org.apache.flink.runtime.testingUtils.TestingJobManager  - Stopping JobManager 
akka.tcp://flink@127.0.0.1:1424/user/jobmanager.
184995 [flink-akka.actor.default-dispatcher-5] INFO  
org.apache.flink.runtime.testingUtils.TestingTaskManager  - Stopping 
TaskManager akka://flink/user/taskmanager0#770941742.
184997 [flink-akka.actor.default-dispatcher-5] INFO  
org.apache.flink.runtime.testingUtils.TestingTaskManager  - Disassociating from 
JobManager
185001 [flink-akka.actor.default-dispatcher-5] INFO  
org.apache.flink.runtime.blob.BlobCache  - Shutting down BlobCache
185011 [flink-akka.actor.default-dispatcher-2] INFO  
org.apache.flink.runtime.blob.BlobServer  - Stopped BLOB server at 0.0.0.0:59077
185015 [flink-akka.actor.default-dispatcher-5] INFO  
org.apache.flink.runtime.io.disk.iomanager.IOManager  - I/O manager removed 
spill file directory /tmp/flink-io-b118915b-de93-4164-aad2-b5ff9ecc4c5e
185019 [flink-akka.actor.default-dispatcher-5] INFO  
org.apache.flink.runtime.testingUtils.TestingTaskManager  - Task manager 
akka://flink/user/taskmanager0 is completely shut down.
185030 [main] INFO  kafka.server.KafkaServer  - [Kafka Server 0], shutting down

> Kafka08ITCase.testFailOnNoBroker() timing out before it has a chance to pass
> 
>
> Key: FLINK-3716
> URL: https://issues.apache.org/jira/browse/FLINK-3716
> Project: Flink
>  Issue Type: Bug
>Reporter: Todd Lisonbee
>
> This is on the latest master 4/7/2016 with `mvn clean verify`.  
> Test also reliably fails running it directly from IntelliJ.
> Test has a 60 second timeout but it seems to need much more time to run (my 
> workstation has server class Xeon).
> Test 
> testFailOnNoBroker(org.apache.flink.streaming.connectors.kafka.Kafka08ITCase) 
> failed with:
> java.lang.Exception: test timed out after 60000 milliseconds
>   at sun.nio.ch.Net.poll(Native Method)
>   at sun.nio.ch.SocketChannelImpl.poll(SocketChannelImpl.java:954)
>   at 
> sun.nio.ch.SocketAdaptor$SocketInputStream.read(SocketAdaptor.java:204)
>   at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103)
>   at 
> java.nio.channels.Channels$ReadableByteChannelImpl.read(Channels.java:385)
>   at kafka.utils.Utils$.read(Utils.scala:380)
>   at 
> kafka.network.BoundedByteBufferReceive.readFrom(BoundedByteBufferReceive.scala:54)
>   at kafka.network.Receive$class.readCompletely(Transmission.scala:56)
>   at 
> kafka.network.BoundedByteBufferReceive.readCompletely(BoundedByteBufferReceive.scala:29)
>   at kafka.network.BlockingChannel.receive(BlockingChannel.scala:111)
>   at 

[jira] [Commented] (FLINK-3716) Kafka08ITCase.testFailOnNoBroker() timing out before it has a chance to pass

2016-04-07 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3716?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15231024#comment-15231024
 ] 

Todd Lisonbee commented on FLINK-3716:
--

Increasing the timeout from 60 seconds to 5 minutes allowed the test to pass on 
my machine.

I'm going to play with it a little more and possibly open a pull request if it 
seems like a safe change.

> Kafka08ITCase.testFailOnNoBroker() timing out before it has a chance to pass
> 
>
> Key: FLINK-3716
> URL: https://issues.apache.org/jira/browse/FLINK-3716
> Project: Flink
>  Issue Type: Bug
>Reporter: Todd Lisonbee
>
> This is on the latest master 4/7/2016 with `mvn clean verify`.  
> Test also reliably fails running it directly from IntelliJ.
> Test has a 60 second timeout but it seems to need much more time to run (my 
> workstation has server class Xeon).
> Test 
> testFailOnNoBroker(org.apache.flink.streaming.connectors.kafka.Kafka08ITCase) 
> failed with:
> java.lang.Exception: test timed out after 60000 milliseconds
>   at sun.nio.ch.Net.poll(Native Method)
>   at sun.nio.ch.SocketChannelImpl.poll(SocketChannelImpl.java:954)
>   at 
> sun.nio.ch.SocketAdaptor$SocketInputStream.read(SocketAdaptor.java:204)
>   at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103)
>   at 
> java.nio.channels.Channels$ReadableByteChannelImpl.read(Channels.java:385)
>   at kafka.utils.Utils$.read(Utils.scala:380)
>   at 
> kafka.network.BoundedByteBufferReceive.readFrom(BoundedByteBufferReceive.scala:54)
>   at kafka.network.Receive$class.readCompletely(Transmission.scala:56)
>   at 
> kafka.network.BoundedByteBufferReceive.readCompletely(BoundedByteBufferReceive.scala:29)
>   at kafka.network.BlockingChannel.receive(BlockingChannel.scala:111)
>   at kafka.consumer.SimpleConsumer.liftedTree1$1(SimpleConsumer.scala:79)
>   at 
> kafka.consumer.SimpleConsumer.kafka$consumer$SimpleConsumer$$sendRequest(SimpleConsumer.scala:68)
>   at kafka.consumer.SimpleConsumer.send(SimpleConsumer.scala:91)
>   at kafka.javaapi.consumer.SimpleConsumer.send(SimpleConsumer.scala:68)
>   at 
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.getPartitionsForTopic(FlinkKafkaConsumer08.java:521)
>   at 
> org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:218)
>   at 
> org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironmentImpl.getConsumer(KafkaTestEnvironmentImpl.java:95)
>   at 
> org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironment.getConsumer(KafkaTestEnvironment.java:65)
>   at 
> org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironment.getConsumer(KafkaTestEnvironment.java:73)
>   at 
> org.apache.flink.streaming.connectors.kafka.KafkaConsumerTestBase.runFailOnNoBrokerTest(KafkaConsumerTestBase.java:155)
>   at 
> org.apache.flink.streaming.connectors.kafka.Kafka08ITCase.testFailOnNoBroker(Kafka08ITCase.java:54)





[jira] [Updated] (FLINK-3716) Kafka08ITCase.testFailOnNoBroker() timing out before it has a chance to pass

2016-04-07 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3716?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3716:
-
Description: 
This is on the latest master 4/7/2016 with `mvn clean verify`.  

Test also reliably fails running it directly from IntelliJ.

Test has a 60 second timeout but it seems to need much more time to run (my 
workstation has server class Xeon).


Test 
testFailOnNoBroker(org.apache.flink.streaming.connectors.kafka.Kafka08ITCase) 
failed with:
java.lang.Exception: test timed out after 60000 milliseconds
at sun.nio.ch.Net.poll(Native Method)
at sun.nio.ch.SocketChannelImpl.poll(SocketChannelImpl.java:954)
at 
sun.nio.ch.SocketAdaptor$SocketInputStream.read(SocketAdaptor.java:204)
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103)
at 
java.nio.channels.Channels$ReadableByteChannelImpl.read(Channels.java:385)
at kafka.utils.Utils$.read(Utils.scala:380)
at 
kafka.network.BoundedByteBufferReceive.readFrom(BoundedByteBufferReceive.scala:54)
at kafka.network.Receive$class.readCompletely(Transmission.scala:56)
at 
kafka.network.BoundedByteBufferReceive.readCompletely(BoundedByteBufferReceive.scala:29)
at kafka.network.BlockingChannel.receive(BlockingChannel.scala:111)
at kafka.consumer.SimpleConsumer.liftedTree1$1(SimpleConsumer.scala:79)
at 
kafka.consumer.SimpleConsumer.kafka$consumer$SimpleConsumer$$sendRequest(SimpleConsumer.scala:68)
at kafka.consumer.SimpleConsumer.send(SimpleConsumer.scala:91)
at kafka.javaapi.consumer.SimpleConsumer.send(SimpleConsumer.scala:68)
at 
org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.getPartitionsForTopic(FlinkKafkaConsumer08.java:521)
at 
org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:218)
at 
org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironmentImpl.getConsumer(KafkaTestEnvironmentImpl.java:95)
at 
org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironment.getConsumer(KafkaTestEnvironment.java:65)
at 
org.apache.flink.streaming.connectors.kafka.KafkaTestEnvironment.getConsumer(KafkaTestEnvironment.java:73)
at 
org.apache.flink.streaming.connectors.kafka.KafkaConsumerTestBase.runFailOnNoBrokerTest(KafkaConsumerTestBase.java:155)
at 
org.apache.flink.streaming.connectors.kafka.Kafka08ITCase.testFailOnNoBroker(Kafka08ITCase.java:54)


  was:
This is on the latest master 4/7/2016 with `mvn clean verify`.  

Test also reliably fails running it directly from IntelliJ.

Test has a 60 second timeout but it seems to need much more time to run (my 
workstation has server class Xeon).

---

/usr/lib/jvm/java-1.8.0-openjdk.x86_64/bin/java -ea -DforkNumber=01 -Xms256m 
-Xmx800m -Dlog4j.configuration=log4j-test.properties -Dmvn.forkNumber=1 
-XX:-UseGCOverheadLimit -Didea.launcher.port=7546 
-Didea.launcher.bin.path=/home/iauser/bin/idea-IU-141.1532.4/bin 
-Dfile.encoding=UTF-8 -classpath 

[jira] [Created] (FLINK-3716) Kafka08ITCase.testFailOnNoBroker() timing out before it has a chance to pass

2016-04-07 Thread Todd Lisonbee (JIRA)
Todd Lisonbee created FLINK-3716:


 Summary: Kafka08ITCase.testFailOnNoBroker() timing out before it 
has a chance to pass
 Key: FLINK-3716
 URL: https://issues.apache.org/jira/browse/FLINK-3716
 Project: Flink
  Issue Type: Bug
Reporter: Todd Lisonbee


This is on the latest master 4/7/2016 with `mvn clean verify`.  

Test also reliably fails running it directly from IntelliJ.

Test has a 60 second timeout but it seems to need much more time to run (my 
workstation has server class Xeon).

---

/usr/lib/jvm/java-1.8.0-openjdk.x86_64/bin/java -ea -DforkNumber=01 -Xms256m 
-Xmx800m -Dlog4j.configuration=log4j-test.properties -Dmvn.forkNumber=1 
-XX:-UseGCOverheadLimit -Didea.launcher.port=7546 
-Didea.launcher.bin.path=/home/iauser/bin/idea-IU-141.1532.4/bin 
-Dfile.encoding=UTF-8 -classpath 

[jira] [Commented] (FLINK-3664) Create a method to easily Summarize a DataSet

2016-04-06 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15229441#comment-15229441
 ] 

Todd Lisonbee commented on FLINK-3664:
--

I figured out my build problem (I missed an Apache header on one file).

I went ahead and closed the original pull request so that I could open a new 
one with a clean commit history.

Here is the new one,
https://github.com/apache/flink/pull/1859

I think it is ready to merge; we'll see if Travis agrees.  Thanks.
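The stddev() accessor in the proposed summarize() API implies collecting variance in a single pass over the data. A minimal standalone sketch using Welford's algorithm; this is illustrative only, not Flink's actual column-summary implementation, and the class name is invented:

```java
// Hedged sketch: single-pass mean and standard deviation via Welford's
// algorithm, in the spirit of the summarize() design. Not Flink code.
public class SinglePassStats {
    private long n;
    private double mean;
    private double m2; // running sum of squared deviations from the mean

    void add(double x) {
        n++;
        double delta = x - mean;
        mean += delta / n;
        m2 += delta * (x - mean);
    }

    double mean()   { return mean; }
    double stddev() { return Math.sqrt(m2 / n); } // population stddev

    public static void main(String[] args) {
        SinglePassStats s = new SinglePassStats();
        for (double x : new double[] {2, 4, 4, 4, 5, 5, 7, 9}) s.add(x);
        // prints the mean and population stddev (approximately 5.0 and 2.0)
        System.out.println(s.mean() + " " + s.stddev());
    }
}
```

An accumulator like this can be kept per column and merged across partitions, which is what makes a single-pass DataSet summary feasible.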

> Create a method to easily Summarize a DataSet
> -
>
> Key: FLINK-3664
> URL: https://issues.apache.org/jira/browse/FLINK-3664
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
> Attachments: DataSet-Summary-Design-March2016-v1.txt
>
>
> Here is an example:
> {code}
> /**
>  * Summarize a DataSet of Tuples by collecting single pass statistics for all 
> columns
>  */
> public Tuple summarize()
> DataSet<Tuple3<Double, String, Boolean>> input = // [...]
> Tuple3<NumericColumnSummary<Double>, StringColumnSummary, BooleanColumnSummary> summary 
> = input.summarize()
> summary.getField(0).stddev()
> summary.getField(1).maxStringLength()
> {code}





[jira] [Commented] (FLINK-3664) Create a method to easily Summarize a DataSet

2016-04-05 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15227311#comment-15227311
 ] 

Todd Lisonbee commented on FLINK-3664:
--

I didn't have Travis CI set up with my GitHub account, so I added another commit 
to kick off a build.

> Create a method to easily Summarize a DataSet
> -
>
> Key: FLINK-3664
> URL: https://issues.apache.org/jira/browse/FLINK-3664
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
> Attachments: DataSet-Summary-Design-March2016-v1.txt
>
>
> Here is an example:
> {code}
> /**
>  * Summarize a DataSet of Tuples by collecting single-pass statistics for
>  * all columns.
>  */
> public Tuple summarize()
> DataSet<Tuple3<Double, String, Integer>> input = // [...]
> Tuple3<NumericColumnSummary<Double>, StringColumnSummary, NumericColumnSummary<Integer>> summary = input.summarize()
> summary.getField(0).stddev()
> summary.getField(1).maxStringLength()
> {code}





[jira] [Commented] (FLINK-3664) Create a method to easily Summarize a DataSet

2016-04-05 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15227214#comment-15227214
 ] 

Todd Lisonbee commented on FLINK-3664:
--

Pull request is ready.  Please let me know if you'd like to see any other 
changes before merging.  Thanks.

> Create a method to easily Summarize a DataSet
> -
>
> Key: FLINK-3664
> URL: https://issues.apache.org/jira/browse/FLINK-3664
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
> Attachments: DataSet-Summary-Design-March2016-v1.txt
>
>
> Here is an example:
> {code}
> /**
>  * Summarize a DataSet of Tuples by collecting single-pass statistics for
>  * all columns.
>  */
> public Tuple summarize()
> DataSet<Tuple3<Double, String, Integer>> input = // [...]
> Tuple3<NumericColumnSummary<Double>, StringColumnSummary, NumericColumnSummary<Integer>> summary = input.summarize()
> summary.getField(0).stddev()
> summary.getField(1).maxStringLength()
> {code}





[jira] [Commented] (FLINK-3664) Create a method to easily Summarize a DataSet

2016-03-30 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15218418#comment-15218418
 ] 

Todd Lisonbee commented on FLINK-3664:
--

I've completed a first pass for this implementation and would like any early 
feedback,
https://github.com/tlisonbee/flink/commit/2a7ad55d704bd3188ea8ae4cbfb7f40319474eef

(the important changes you might want to look at are in Aggregator, 
NumericSummaryAggregator, and DataSetUtils)

My "to do" list before submitting pull request:
- Blanket the code with comments, unit tests, and integration tests
- Incorporate any early feedback

Tasks I was planning on doing under a follow-on JIRA (not part of initial pull 
request):
- Add support for more data types (unless any others seem like must-haves, in 
which case I can do them now)
- Add a summarize() method for GroupedDataSets

Thanks.

> Create a method to easily Summarize a DataSet
> -
>
> Key: FLINK-3664
> URL: https://issues.apache.org/jira/browse/FLINK-3664
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
> Attachments: DataSet-Summary-Design-March2016-v1.txt
>
>
> Here is an example:
> {code}
> /**
>  * Summarize a DataSet of Tuples by collecting single-pass statistics for
>  * all columns.
>  */
> public Tuple summarize()
> DataSet<Tuple3<Double, String, Integer>> input = // [...]
> Tuple3<NumericColumnSummary<Double>, StringColumnSummary, NumericColumnSummary<Integer>> summary = input.summarize()
> summary.getField(0).stddev()
> summary.getField(1).maxStringLength()
> {code}





[jira] [Commented] (FLINK-3664) Create a method to easily Summarize a DataSet

2016-03-24 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15210409#comment-15210409
 ] 

Todd Lisonbee commented on FLINK-3664:
--

Hi Fabian, thanks for the feedback.

Your first 3 comments all make sense - agreed.

On distinct counts, I thought about it but wasn't sure, so I left it out for 
now.  For an approximation, the best idea I had was to choose some arbitrary 
cap, maybe 100, and then report the exact number of distinct values if there 
are fewer than 100, or "100+" otherwise.  This would be nice for categorical 
variables that happen to have fewer than 100 different values.  But with 
enough rows and columns it could be expensive (even if Tuple is currently 
limited to 22), or at least relatively more expensive than the other 
calculations.  There isn't a perfect magic number, so I wasn't fully satisfied 
with this idea.
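The capped idea above could be sketched roughly as follows.  Everything here 
is illustrative only (the class name, method names, and the cap of 100 are 
hypothetical, not existing Flink code):

```java
import java.util.HashSet;
import java.util.Set;

/**
 * Sketch of the capped distinct-count idea: track exact distinct values up
 * to a cap, then report "100+" once the cap is exceeded.
 */
public class CappedDistinctCount {
    private static final int CAP = 100;
    private final Set<Object> seen = new HashSet<>();
    private boolean overflowed = false;

    public void add(Object value) {
        if (overflowed) {
            return; // past the cap, nothing more to track
        }
        seen.add(value);
        if (seen.size() > CAP) {
            overflowed = true;
            seen.clear(); // free memory; the exact set is no longer needed
        }
    }

    /** Exact count below the cap, or "100+" once exceeded. */
    public String result() {
        return overflowed ? CAP + "+" : String.valueOf(seen.size());
    }
}
```

The memory cost is bounded by the cap, but as noted, with many columns even a 
small per-column set is relatively more expensive than the other single-pass 
statistics.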

Do you know of a nice way to approximate distinct counts?

Thanks.

> Create a method to easily Summarize a DataSet
> -
>
> Key: FLINK-3664
> URL: https://issues.apache.org/jira/browse/FLINK-3664
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
> Attachments: DataSet-Summary-Design-March2016-v1.txt
>
>
> Here is an example:
> {code}
> /**
>  * Summarize a DataSet of Tuples by collecting single-pass statistics for
>  * all columns.
>  */
> public Tuple summarize()
> DataSet<Tuple3<Double, String, Integer>> input = // [...]
> Tuple3<NumericColumnSummary<Double>, StringColumnSummary, NumericColumnSummary<Integer>> summary = input.summarize()
> summary.getField(0).stddev()
> summary.getField(1).maxStringLength()
> {code}





[jira] [Updated] (FLINK-3664) Create a method to easily Summarize a DataSet

2016-03-23 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3664?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3664:
-
Description: 
Here is an example:

{code}

/**
 * Summarize a DataSet of Tuples by collecting single-pass statistics for
 * all columns.
 */
public Tuple summarize()

DataSet<Tuple3<Double, String, Integer>> input = // [...]
Tuple3<NumericColumnSummary<Double>, StringColumnSummary, NumericColumnSummary<Integer>> summary = input.summarize()

summary.getField(0).stddev()
summary.getField(1).maxStringLength()

{code}

  was:
Here is an example:

/**
 * Summarize a DataSet of Tuples by collecting single-pass statistics for
 * all columns.
 */
public Tuple summarize()

DataSet<Tuple3<Double, String, Integer>> input = // [...]
Tuple3<NumericColumnSummary<Double>, StringColumnSummary, NumericColumnSummary<Integer>> summary = input.summarize()

summary.getField(0).stddev()
summary.getField(1).maxStringLength()


> Create a method to easily Summarize a DataSet
> -
>
> Key: FLINK-3664
> URL: https://issues.apache.org/jira/browse/FLINK-3664
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
> Attachments: DataSet-Summary-Design-March2016-v1.txt
>
>
> Here is an example:
> {code}
> /**
>  * Summarize a DataSet of Tuples by collecting single-pass statistics for
>  * all columns.
>  */
> public Tuple summarize()
> DataSet<Tuple3<Double, String, Integer>> input = // [...]
> Tuple3<NumericColumnSummary<Double>, StringColumnSummary, NumericColumnSummary<Integer>> summary = input.summarize()
> summary.getField(0).stddev()
> summary.getField(1).maxStringLength()
> {code}





[jira] [Updated] (FLINK-3664) Create a method to easily Summarize a DataSet

2016-03-23 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3664?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3664:
-
Description: 
Here is an example:

/**
 * Summarize a DataSet of Tuples by collecting single-pass statistics for
 * all columns.
 */
public Tuple summarize()

DataSet<Tuple3<Double, String, Integer>> input = // [...]
Tuple3<NumericColumnSummary<Double>, StringColumnSummary, NumericColumnSummary<Integer>> summary = input.summarize()

summary.getField(0).stddev()
summary.getField(1).maxStringLength()

  was:
Here is an example:


/**
 * Summarize a DataSet of Tuples by collecting single-pass statistics for
 * all columns.
 */
public Tuple summarize()

DataSet<Tuple3<Double, String, Integer>> input = // [...]
Tuple3<NumericColumnSummary<Double>, StringColumnSummary, NumericColumnSummary<Integer>> summary = input.summarize()

summary.getField(0).stddev()
summary.getField(1).maxStringLength()


> Create a method to easily Summarize a DataSet
> -
>
> Key: FLINK-3664
> URL: https://issues.apache.org/jira/browse/FLINK-3664
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
> Attachments: DataSet-Summary-Design-March2016-v1.txt
>
>
> Here is an example:
> /**
>  * Summarize a DataSet of Tuples by collecting single-pass statistics for
>  * all columns.
>  */
> public Tuple summarize()
> DataSet<Tuple3<Double, String, Integer>> input = // [...]
> Tuple3<NumericColumnSummary<Double>, StringColumnSummary, NumericColumnSummary<Integer>> summary = input.summarize()
> summary.getField(0).stddev()
> summary.getField(1).maxStringLength()





[jira] [Commented] (FLINK-3613) Add standard deviation, mean, variance to list of Aggregations

2016-03-23 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3613?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15208869#comment-15208869
 ] 

Todd Lisonbee commented on FLINK-3613:
--

I created another related JIRA FLINK-3664 with a design for a summarize() 
function.  

I think FLINK-3664 would be a better place for me to start than improving the 
existing aggregations.

> Add standard deviation, mean, variance to list of Aggregations
> --
>
> Key: FLINK-3613
> URL: https://issues.apache.org/jira/browse/FLINK-3613
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
>Priority: Minor
> Attachments: DataSet-Aggregation-Design-March2016-v1.txt
>
>
> Implement standard deviation, mean, variance for 
> org.apache.flink.api.java.aggregation.Aggregations
> Ideally implementation should be single pass and numerically stable.
> References:
> "Scalable and Numerically Stable Descriptive Statistics in SystemML", Tian et 
> al, International Conference on Data Engineering 2012
> http://dl.acm.org/citation.cfm?id=2310392
> "The Kahan summation algorithm (also known as compensated summation) reduces 
> the numerical errors that occur when adding a sequence of finite precision 
> floating point numbers. Numerical errors arise due to truncation and 
> rounding. These errors can lead to numerical instability when calculating 
> variance."
> https://en.wikipedia.org/wiki/Kahan_summation_algorithm





[jira] [Updated] (FLINK-3664) Create a method to easily Summarize a DataSet

2016-03-23 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3664?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3664:
-
Attachment: DataSet-Summary-Design-March2016-v1.txt

Attached is a first revision of a design.

(I had started on another related ticket, FLINK-3613, but I think FLINK-3664 
would be better to do first.)

> Create a method to easily Summarize a DataSet
> -
>
> Key: FLINK-3664
> URL: https://issues.apache.org/jira/browse/FLINK-3664
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
> Attachments: DataSet-Summary-Design-March2016-v1.txt
>
>
> Here is an example:
> /**
>  * Summarize a DataSet of Tuples by collecting single-pass statistics for
>  * all columns.
>  */
> public Tuple summarize()
> DataSet<Tuple3<Double, String, Integer>> input = // [...]
> Tuple3<NumericColumnSummary<Double>, StringColumnSummary, NumericColumnSummary<Integer>> summary = input.summarize()
> summary.getField(0).stddev()
> summary.getField(1).maxStringLength()





[jira] [Commented] (FLINK-3664) Create a method to easily Summarize a DataSet

2016-03-23 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3664?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15208806#comment-15208806
 ] 

Todd Lisonbee commented on FLINK-3664:
--

I'm writing a design for this now.

> Create a method to easily Summarize a DataSet
> -
>
> Key: FLINK-3664
> URL: https://issues.apache.org/jira/browse/FLINK-3664
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
>
> Here is an example:
> /**
>  * Summarize a DataSet of Tuples by collecting single-pass statistics for
>  * all columns.
>  */
> public Tuple summarize()
> DataSet<Tuple3<Double, String, Integer>> input = // [...]
> Tuple3<NumericColumnSummary<Double>, StringColumnSummary, NumericColumnSummary<Integer>> summary = input.summarize()
> summary.getField(0).stddev()
> summary.getField(1).maxStringLength()





[jira] [Created] (FLINK-3664) Create a method to easily Summarize a DataSet

2016-03-23 Thread Todd Lisonbee (JIRA)
Todd Lisonbee created FLINK-3664:


 Summary: Create a method to easily Summarize a DataSet
 Key: FLINK-3664
 URL: https://issues.apache.org/jira/browse/FLINK-3664
 Project: Flink
  Issue Type: Improvement
Reporter: Todd Lisonbee


Here is an example:


/**
 * Summarize a DataSet of Tuples by collecting single-pass statistics for
 * all columns.
 */
public Tuple summarize()

DataSet<Tuple3<Double, String, Integer>> input = // [...]
Tuple3<NumericColumnSummary<Double>, StringColumnSummary, NumericColumnSummary<Integer>> summary = input.summarize()

summary.getField(0).stddev()
summary.getField(1).maxStringLength()





[jira] [Comment Edited] (FLINK-3613) Add standard deviation, mean, variance to list of Aggregations

2016-03-22 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3613?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15207054#comment-15207054
 ] 

Todd Lisonbee edited comment on FLINK-3613 at 3/22/16 7:02 PM:
---

Attached is a design for improvements to DataSet.aggregate() needed to 
implement additional aggregations like Standard Deviation.

To maintain the public APIs, it seems like the best path would be to have 
AggregateOperator implement CustomUnaryOperation, but that seems odd because 
no other Operator is implemented that way.  The other options I see don't seem 
consistent with the existing Operators either.

I really could use some feedback on this.  Thanks.

Also, should I be posting this to the Dev mailing list?


was (Author: tlisonbee):
Attached is a design for improvements to DataSet.aggregate() needed to 
implement additional aggregations like Standard Deviation.

To maintain public API's it seems like the best path would be to have 
AggregateOperator implement CustomUnaryOperation but that seems weird because 
no other Operator is done that way.  But other options I see don't seem 
consistent with other Operators either.

I really could use some feedback on this.  Thanks.

> Add standard deviation, mean, variance to list of Aggregations
> --
>
> Key: FLINK-3613
> URL: https://issues.apache.org/jira/browse/FLINK-3613
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
>Priority: Minor
> Attachments: DataSet-Aggregation-Design-March2016-v1.txt
>
>
> Implement standard deviation, mean, variance for 
> org.apache.flink.api.java.aggregation.Aggregations
> Ideally implementation should be single pass and numerically stable.
> References:
> "Scalable and Numerically Stable Descriptive Statistics in SystemML", Tian et 
> al, International Conference on Data Engineering 2012
> http://dl.acm.org/citation.cfm?id=2310392
> "The Kahan summation algorithm (also known as compensated summation) reduces 
> the numerical errors that occur when adding a sequence of finite precision 
> floating point numbers. Numerical errors arise due to truncation and 
> rounding. These errors can lead to numerical instability when calculating 
> variance."
> https://en.wikipedia.org/wiki/Kahan_summation_algorithm





[jira] [Updated] (FLINK-3613) Add standard deviation, mean, variance to list of Aggregations

2016-03-22 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3613?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3613:
-
Attachment: DataSet-Aggregation-Design-March2016-v1.txt

Attached is a design for improvements to DataSet.aggregate() needed to 
implement additional aggregations like Standard Deviation.

To maintain the public APIs, it seems like the best path would be to have 
AggregateOperator implement CustomUnaryOperation, but that seems odd because 
no other Operator is implemented that way.  The other options I see don't seem 
consistent with the existing Operators either.

I really could use some feedback on this.  Thanks.

> Add standard deviation, mean, variance to list of Aggregations
> --
>
> Key: FLINK-3613
> URL: https://issues.apache.org/jira/browse/FLINK-3613
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
>Priority: Minor
> Attachments: DataSet-Aggregation-Design-March2016-v1.txt
>
>
> Implement standard deviation, mean, variance for 
> org.apache.flink.api.java.aggregation.Aggregations
> Ideally implementation should be single pass and numerically stable.
> References:
> "Scalable and Numerically Stable Descriptive Statistics in SystemML", Tian et 
> al, International Conference on Data Engineering 2012
> http://dl.acm.org/citation.cfm?id=2310392
> "The Kahan summation algorithm (also known as compensated summation) reduces 
> the numerical errors that occur when adding a sequence of finite precision 
> floating point numbers. Numerical errors arise due to truncation and 
> rounding. These errors can lead to numerical instability when calculating 
> variance."
> https://en.wikipedia.org/wiki/Kahan_summation_algorithm





[jira] [Commented] (FLINK-3613) Add standard deviation, mean, variance to list of Aggregations

2016-03-19 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3613?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15201538#comment-15201538
 ] 

Todd Lisonbee commented on FLINK-3613:
--

Sure, I'll create a design for this.  Thanks.

> Add standard deviation, mean, variance to list of Aggregations
> --
>
> Key: FLINK-3613
> URL: https://issues.apache.org/jira/browse/FLINK-3613
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
>Priority: Minor
>
> Implement standard deviation, mean, variance for 
> org.apache.flink.api.java.aggregation.Aggregations
> Ideally implementation should be single pass and numerically stable.
> References:
> "Scalable and Numerically Stable Descriptive Statistics in SystemML", Tian et 
> al, International Conference on Data Engineering 2012
> http://dl.acm.org/citation.cfm?id=2310392
> "The Kahan summation algorithm (also known as compensated summation) reduces 
> the numerical errors that occur when adding a sequence of finite precision 
> floating point numbers. Numerical errors arise due to truncation and 
> rounding. These errors can lead to numerical instability when calculating 
> variance."
> https://en.wikipedia.org/wiki/Kahan_summation_algorithm





[jira] [Commented] (FLINK-3613) Add standard deviation, mean, variance to list of Aggregations

2016-03-19 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3613?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15200562#comment-15200562
 ] 

Todd Lisonbee commented on FLINK-3613:
--

I didn't find exact overlap (FLINK-2144 was similar except for Windows, and 
FLINK-2379 is for vectors but isn't using the above interface).

---

Implementing this isn't as easy as extending the existing AggregationFunction 
abstract class.  AggregationFunction works for Sum, Min, and Max but isn't 
general enough for other aggregations.

An aggregation should have three types:
1) the value type - the type being aggregated
2) the aggregate type - the intermediate type that carries all needed data for 
the aggregation
3) the result type - the result of the aggregation

For example, if you are aggregating doubles in different ways:
SUM - value type is double, aggregation type is double, result type is double
COUNT - value type is double, aggregation type is probably long, result type is 
long
STANDARD_DEVIATION - value type is double, aggregation type would be a complex 
type (count, mean, sum of squared differences from the current mean, deltas), 
result type is double
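The three-type shape above could be sketched as a small interface.  The names 
below are hypothetical, not the actual Flink API; the standard-deviation 
intermediate uses Welford's numerically stable single-pass update, and the 
merge step uses the parallel combine of Chan et al.:

```java
/**
 * Illustrative three-type aggregation: value type T, intermediate
 * aggregate type A, result type R.
 */
interface Aggregator<T, A, R> {
    A initial();
    A add(A agg, T value);
    A merge(A left, A right);
    R result(A agg);
}

/** STANDARD_DEVIATION: value Double, intermediate {count, mean, M2}, result Double. */
class StdDevAggregator implements Aggregator<Double, double[], Double> {
    public double[] initial() {
        return new double[] {0.0, 0.0, 0.0}; // count, mean, M2 (sum of squared diffs)
    }
    public double[] add(double[] agg, Double value) {
        agg[0] += 1;
        double delta = value - agg[1];
        agg[1] += delta / agg[0];              // update running mean
        agg[2] += delta * (value - agg[1]);    // update M2 (Welford)
        return agg;
    }
    public double[] merge(double[] left, double[] right) {
        // combine two partial aggregates, needed for a distributed single pass
        double n = left[0] + right[0];
        if (n == 0) {
            return left;
        }
        double delta = right[1] - left[1];
        double mean = left[1] + delta * right[0] / n;
        double m2 = left[2] + right[2] + delta * delta * left[0] * right[0] / n;
        return new double[] {n, mean, m2};
    }
    public Double result(double[] agg) {
        return agg[0] > 1 ? Math.sqrt(agg[2] / (agg[0] - 1)) : 0.0; // sample stddev
    }
}
```

The merge step is what makes this work in parallel: each partition keeps its 
own (count, mean, M2) and partial results combine without a second pass over 
the data.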

> Add standard deviation, mean, variance to list of Aggregations
> --
>
> Key: FLINK-3613
> URL: https://issues.apache.org/jira/browse/FLINK-3613
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
>Priority: Minor
>
> Implement standard deviation, mean, variance for 
> org.apache.flink.api.java.aggregation.Aggregations
> Ideally implementation should be single pass and numerically stable.
> References:
> "Scalable and Numerically Stable Descriptive Statistics in SystemML", Tian et 
> al, International Conference on Data Engineering 2012
> http://dl.acm.org/citation.cfm?id=2310392
> "The Kahan summation algorithm (also known as compensated summation) reduces 
> the numerical errors that occur when adding a sequence of finite precision 
> floating point numbers. Numerical errors arise due to truncation and 
> rounding. These errors can lead to numerical instability when calculating 
> variance."
> https://en.wikipedia.org/wiki/Kahan_summation_algorithm





[jira] [Updated] (FLINK-3613) Add standard deviation, mean, variance to list of Aggregations

2016-03-14 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3613?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3613:
-
Description: 
Implement standard deviation, mean, variance for 
org.apache.flink.api.java.aggregation.Aggregations

Ideally implementation should be single pass and numerically stable.

References:

"Scalable and Numerically Stable Descriptive Statistics in SystemML", Tian et 
al, International Conference on Data Engineering 2012
http://dl.acm.org/citation.cfm?id=2310392

"The Kahan summation algorithm (also known as compensated summation) reduces 
the numerical errors that occur when adding a sequence of finite precision 
floating point numbers. Numerical errors arise due to truncation and rounding. 
These errors can lead to numerical instability when calculating variance."
https://en.wikipedia.org/wiki/Kahan_summation_algorithm


  was:
Implement Standard Deviation for 
org.apache.flink.api.java.aggregation.Aggregations

Ideally implementation should be single pass and numerically stable.

References:

"Scalable and Numerically Stable Descriptive Statistics in SystemML", Tian et 
al, International Conference on Data Engineering 2012
http://dl.acm.org/citation.cfm?id=2310392

"The Kahan summation algorithm (also known as compensated summation) reduces 
the numerical errors that occur when adding a sequence of finite precision 
floating point numbers. Numerical errors arise due to truncation and rounding. 
These errors can lead to numerical instability when calculating variance."
https://en.wikipedia.org/wiki/Kahan_summation_algorithm



> Add standard deviation, mean, variance to list of Aggregations
> --
>
> Key: FLINK-3613
> URL: https://issues.apache.org/jira/browse/FLINK-3613
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
>Priority: Minor
>
> Implement standard deviation, mean, variance for 
> org.apache.flink.api.java.aggregation.Aggregations
> Ideally implementation should be single pass and numerically stable.
> References:
> "Scalable and Numerically Stable Descriptive Statistics in SystemML", Tian et 
> al, International Conference on Data Engineering 2012
> http://dl.acm.org/citation.cfm?id=2310392
> "The Kahan summation algorithm (also known as compensated summation) reduces 
> the numerical errors that occur when adding a sequence of finite precision 
> floating point numbers. Numerical errors arise due to truncation and 
> rounding. These errors can lead to numerical instability when calculating 
> variance."
> https://en.wikipedia.org/wiki/Kahan_summation_algorithm





[jira] [Updated] (FLINK-3613) Add standard deviation, mean, variance to list of Aggregations

2016-03-14 Thread Todd Lisonbee (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-3613?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Todd Lisonbee updated FLINK-3613:
-
Summary: Add standard deviation, mean, variance to list of Aggregations  
(was: Add standard deviation to list of Aggregations)

> Add standard deviation, mean, variance to list of Aggregations
> --
>
> Key: FLINK-3613
> URL: https://issues.apache.org/jira/browse/FLINK-3613
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
>Priority: Minor
>
> Implement Standard Deviation for 
> org.apache.flink.api.java.aggregation.Aggregations
> Ideally implementation should be single pass and numerically stable.
> References:
> "Scalable and Numerically Stable Descriptive Statistics in SystemML", Tian et 
> al, International Conference on Data Engineering 2012
> http://dl.acm.org/citation.cfm?id=2310392
> "The Kahan summation algorithm (also known as compensated summation) reduces 
> the numerical errors that occur when adding a sequence of finite precision 
> floating point numbers. Numerical errors arise due to truncation and 
> rounding. These errors can lead to numerical instability when calculating 
> variance."
> https://en.wikipedia.org/wiki/Kahan_summation_algorithm





[jira] [Commented] (FLINK-3613) Add standard deviation to list of Aggregations

2016-03-14 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3613?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15193827#comment-15193827
 ] 

Todd Lisonbee commented on FLINK-3613:
--

I checked the Spark code base; it looks like they used the same technique 
described in the links above:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/StatCounter.scala

I'm going to expand this JIRA to also include adding Mean and Variance to the 
list of Aggregations.  There is code overlap for all three, so it probably 
makes sense to solve them together (like StatCounter.scala).

I noticed there is already an "Average" aggregation that is commented out 
(possibly because of numerical stability problems it would have).

I'll search JIRA for possible overlap.

> Add standard deviation to list of Aggregations
> --
>
> Key: FLINK-3613
> URL: https://issues.apache.org/jira/browse/FLINK-3613
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
>Priority: Minor
>
> Implement Standard Deviation for 
> org.apache.flink.api.java.aggregation.Aggregations
> Ideally implementation should be single pass and numerically stable.
> References:
> "Scalable and Numerically Stable Descriptive Statistics in SystemML", Tian et 
> al, International Conference on Data Engineering 2012
> http://dl.acm.org/citation.cfm?id=2310392
> "The Kahan summation algorithm (also known as compensated summation) reduces 
> the numerical errors that occur when adding a sequence of finite precision 
> floating point numbers. Numerical errors arise due to truncation and 
> rounding. These errors can lead to numerical instability when calculating 
> variance."
> https://en.wikipedia.org/wiki/Kahan_summation_algorithm





[jira] [Commented] (FLINK-3613) Add standard deviation to list of Aggregations

2016-03-14 Thread Todd Lisonbee (JIRA)

[ 
https://issues.apache.org/jira/browse/FLINK-3613?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15193522#comment-15193522
 ] 

Todd Lisonbee commented on FLINK-3613:
--

Hello, I'm new to Apache Flink and would like to contribute some code.  A 
standard deviation aggregation seemed like an easy place to start.

I did a quick search and didn't see anyone already working on this.

A teammate of mine implemented something similar to what I believe is needed 
against Apache Spark here:
https://github.com/trustedanalytics/atk/blob/master/engine-plugins/frame-plugins/src/main/scala/org/trustedanalytics/atk/engine/frame/plugins/groupby/aggregators/VarianceAggregator.scala

I was going to write a fresh implementation for Flink - unless someone stops me.

Thanks!


> Add standard deviation to list of Aggregations
> --
>
> Key: FLINK-3613
> URL: https://issues.apache.org/jira/browse/FLINK-3613
> Project: Flink
>  Issue Type: Improvement
>Reporter: Todd Lisonbee
>Priority: Minor
>
> Implement Standard Deviation for 
> org.apache.flink.api.java.aggregation.Aggregations
> Ideally implementation should be single pass and numerically stable.
> References:
> "Scalable and Numerically Stable Descriptive Statistics in SystemML", Tian et 
> al, International Conference on Data Engineering 2012
> http://dl.acm.org/citation.cfm?id=2310392
> "The Kahan summation algorithm (also known as compensated summation) reduces 
> the numerical errors that occur when adding a sequence of finite precision 
> floating point numbers. Numerical errors arise due to truncation and 
> rounding. These errors can lead to numerical instability when calculating 
> variance."
> https://en.wikipedia.org/wiki/Kahan_summation_algorithm





[jira] [Created] (FLINK-3613) Add standard deviation to list of Aggregations

2016-03-14 Thread Todd Lisonbee (JIRA)
Todd Lisonbee created FLINK-3613:


 Summary: Add standard deviation to list of Aggregations
 Key: FLINK-3613
 URL: https://issues.apache.org/jira/browse/FLINK-3613
 Project: Flink
  Issue Type: Improvement
Reporter: Todd Lisonbee
Priority: Minor


Implement Standard Deviation for 
org.apache.flink.api.java.aggregation.Aggregations

Ideally implementation should be single pass and numerically stable.

References:

"Scalable and Numerically Stable Descriptive Statistics in SystemML", Tian et 
al, International Conference on Data Engineering 2012
http://dl.acm.org/citation.cfm?id=2310392

"The Kahan summation algorithm (also known as compensated summation) reduces 
the numerical errors that occur when adding a sequence of finite precision 
floating point numbers. Numerical errors arise due to truncation and rounding. 
These errors can lead to numerical instability when calculating variance."
https://en.wikipedia.org/wiki/Kahan_summation_algorithm
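The compensated summation described above can be sketched in a few lines 
(illustrative only, not Flink code):

```java
/** Minimal sketch of Kahan (compensated) summation. */
public class KahanSummation {
    static double kahanSum(double[] values) {
        double sum = 0.0;
        double c = 0.0; // running compensation for lost low-order bits
        for (double v : values) {
            double y = v - c;    // apply the compensation to the next value
            double t = sum + y;  // low-order bits of y may be lost in this add
            c = (t - sum) - y;   // algebraically zero; recovers the lost bits
            sum = t;
        }
        return sum;
    }
}
```

The compensation term makes the accumulated rounding error nearly independent 
of the number of values summed, which is why it matters for a single-pass 
variance over large DataSets.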



