Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/3564#issuecomment-66827106
The interleaved output makes this a bit tricky to diagnose, but it looks like the "launch simple application with spark-submit" test failed due to a port-binding issue:
```
14/12/12 01:49:16 ERROR SparkUI: Failed to bind SparkUI
java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries!
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
	at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
	at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
	at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
	at org.eclipse.jetty.server.Server.doStart(Server.java:293)
	at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
	at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:194)
	at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:204)
	at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:204)
	at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1675)
	at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
	at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1666)
	at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:204)
	at org.apache.spark.ui.WebUI.bind(WebUI.scala:102)
	at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:269)
	at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:269)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:269)
	at org.apache.spark.deploy.SimpleApplicationTest$.main(SparkSubmitSuite.scala:481)
	at org.apache.spark.deploy.SimpleApplicationTest.main(SparkSubmitSuite.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:365)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
```
An earlier PR (#2363) disabled the Spark UI in unit tests, but it looks like we're not passing that option through to the `spark-submit` call, which leaves this test prone to flakiness whenever the UI ports are taken. The fix is probably as simple as passing `--conf spark.ui.enabled=false` to `spark-submit` in that test.
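For illustration, the invocation might look something like this; the class, master, and jar path below are placeholders, not the test's actual arguments:

```shell
# Sketch only: disable the SparkUI so the forked app never tries to bind a port.
./bin/spark-submit \
  --class org.apache.spark.deploy.SimpleApplicationTest \
  --master local \
  --conf spark.ui.enabled=false \
  path/to/test-app.jar
```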
Part of what made this hard to debug is that the stdout / stderr of the forked `spark-submit` process was dumped straight into the main Jenkins log. We could make these kinds of tests easier to debug by capturing the forked process's output and including it in the exception message when the test fails; ScalaTest fixtures would be one way to wire that up. I can try setting this up and send a PR against this PR once I'm done.
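A minimal sketch of the capture idea, assuming `scala.sys.process` is acceptable here; the object name and helper are hypothetical, and `echo` stands in for the forked `spark-submit` so the sketch runs anywhere:

```scala
import scala.sys.process.{Process, ProcessLogger}

object CaptureOutputSketch {
  // Run a command, buffering its stdout and stderr instead of letting them
  // interleave with the main test log; return the exit code plus the output.
  def runAndCapture(cmd: Seq[String]): (Int, String) = {
    val buf = new StringBuilder
    val logger = ProcessLogger(
      out => buf.append(out).append('\n'),
      err => buf.append(err).append('\n'))
    val exitCode = Process(cmd).!(logger)
    (exitCode, buf.toString)
  }

  def main(args: Array[String]): Unit = {
    val (exitCode, output) = runAndCapture(Seq("echo", "hello from child"))
    // A ScalaTest fixture could attach `output` to the failure message, so a
    // failed assertion shows exactly what the forked process printed.
    assert(exitCode == 0, s"child process failed; captured output:\n$output")
  }
}
```

A fixture (e.g. via ScalaTest's loan pattern) would wrap each test body, hold the captured output, and append it to any thrown assertion error.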