GitHub user ash211 opened a pull request:

    https://github.com/apache/spark/pull/1019

    SPARK-1902 Silence stacktrace from logs when doing port failover to port n+1

    Before:
    
    ```
    14/06/08 23:58:23 WARN AbstractLifeCycle: FAILED [email protected]:4040: java.net.BindException: Address already in use
    java.net.BindException: Address already in use
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:444)
        at sun.nio.ch.Net.bind(Net.java:436)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
        at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
        at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
        at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.eclipse.jetty.server.Server.doStart(Server.java:293)
        at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.apache.spark.ui.JettyUtils$$anonfun$1.apply$mcV$sp(JettyUtils.scala:192)
        at org.apache.spark.ui.JettyUtils$$anonfun$1.apply(JettyUtils.scala:192)
        at org.apache.spark.ui.JettyUtils$$anonfun$1.apply(JettyUtils.scala:192)
        at scala.util.Try$.apply(Try.scala:161)
        at org.apache.spark.ui.JettyUtils$.connect$1(JettyUtils.scala:191)
        at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:205)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:99)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:223)
        at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:957)
        at $line3.$read$$iwC$$iwC.<init>(<console>:8)
        at $line3.$read$$iwC.<init>(<console>:14)
        at $line3.$read.<init>(<console>:16)
        at $line3.$read$.<init>(<console>:20)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.<init>(<console>:7)
        at $line3.$eval$.<clinit>(<console>)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:121)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:120)
        at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:263)
        at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:120)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:56)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:913)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:142)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:56)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:104)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:56)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:930)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    14/06/08 23:58:23 WARN AbstractLifeCycle: FAILED org.eclipse.jetty.server.Server@7439e55a: java.net.BindException: Address already in use
    java.net.BindException: Address already in use
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:444)
        at sun.nio.ch.Net.bind(Net.java:436)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
        at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
        at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
        at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.eclipse.jetty.server.Server.doStart(Server.java:293)
        at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.apache.spark.ui.JettyUtils$$anonfun$1.apply$mcV$sp(JettyUtils.scala:192)
        at org.apache.spark.ui.JettyUtils$$anonfun$1.apply(JettyUtils.scala:192)
        at org.apache.spark.ui.JettyUtils$$anonfun$1.apply(JettyUtils.scala:192)
        at scala.util.Try$.apply(Try.scala:161)
        at org.apache.spark.ui.JettyUtils$.connect$1(JettyUtils.scala:191)
        at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:205)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:99)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:223)
        at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:957)
        at $line3.$read$$iwC$$iwC.<init>(<console>:8)
        at $line3.$read$$iwC.<init>(<console>:14)
        at $line3.$read.<init>(<console>:16)
        at $line3.$read$.<init>(<console>:20)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.<init>(<console>:7)
        at $line3.$eval$.<clinit>(<console>)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:788)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1056)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:614)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:645)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:609)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:796)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:841)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:753)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:121)
        at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:120)
        at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:263)
        at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:120)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:56)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:913)
        at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:142)
        at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:56)
        at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:104)
        at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:56)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:930)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
        at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:884)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:884)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:982)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    14/06/08 23:58:23 INFO JettyUtils: Failed to create UI at port, 4040. Trying again.
    14/06/08 23:58:23 INFO JettyUtils: Error was: Failure(java.net.BindException: Address already in use)
    14/06/08 23:58:23 INFO SparkUI: Started SparkUI at http://aash-mbp.local:4041
    ```
    
    After:
    ```
    14/06/09 00:04:12 INFO JettyUtils: Failed to create UI at port, 4040. Trying again.
    14/06/09 00:04:12 INFO JettyUtils: Error was: Failure(java.net.BindException: Address already in use)
    14/06/09 00:04:12 INFO Server: jetty-8.y.z-SNAPSHOT
    14/06/09 00:04:12 INFO AbstractConnector: Started [email protected]:4041
    14/06/09 00:04:12 INFO SparkUI: Started SparkUI at http://aash-mbp.local:4041
    ```
    
    The lengthy logging comes from this line of code in Jetty: http://grepcode.com/file/repo1.maven.org/maven2/org.eclipse.jetty.aggregate/jetty-all/9.1.3.v20140225/org/eclipse/jetty/util/component/AbstractLifeCycle.java#210
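
    For illustration only, here is a minimal Scala sketch of the kind of port-failover retry loop involved. This is not the code from this pull request; the object and method names (`PortFailoverExample`, `startWithFailover`) are hypothetical, and it only shows failing over to port n+1 while logging a one-line message instead of a stacktrace. Silencing Jetty's own AbstractLifeCycle WARN output (the line linked above) is handled separately by the actual change.

    ```
    import java.net.BindException
    import scala.util.{Failure, Success, Try}

    import org.eclipse.jetty.server.Server

    // Hypothetical sketch: try basePort, basePort + 1, ... until a bind succeeds.
    object PortFailoverExample {
      def startWithFailover(basePort: Int, maxRetries: Int = 16): (Server, Int) = {
        def attempt(port: Int, triesLeft: Int): (Server, Int) = {
          val server = new Server(port)
          Try(server.start()) match {
            case Success(_) =>
              (server, port)
            case Failure(e: BindException) if triesLeft > 0 =>
              // Log a single line instead of the full stacktrace, then fail over.
              println(s"Failed to bind port $port (${e.getMessage}); trying port ${port + 1}")
              server.stop()
              attempt(port + 1, triesLeft - 1)
            case Failure(e) =>
              throw e
          }
        }
        attempt(basePort, maxRetries)
      }

      def main(args: Array[String]): Unit = {
        val (server, port) = startWithFailover(4040)
        println(s"Started Jetty on port $port")
        server.stop()
      }
    }
    ```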

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/ash211/spark SPARK-1902

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/1019.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #1019
    
----
commit 9d85eedf26fe6c0f8c98c6f90c8e07de00dc4e8e
Author: Andrew Ash <[email protected]>
Date:   2014-06-09T06:59:53Z

    SPARK-1902 Silence stacktrace from logs when doing port failover to port n+1

----


