In my unit tests I have a base class, which all my tests extend, with a setUp and a tearDown method that they inherit. They look something like this:

    var spark: SparkContext = _

    @Before
    def setUp() {
        Thread.sleep(100L) // this seems to give Spark more time to reset from the previous test's tearDown
        spark = new SparkContext("local", "test spark")
    }

    @After
    def tearDown() {
        spark.stop()
        spark = null // not sure why this helps, but it does!
        System.clearProperty("spark.master.port")
    }
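
For context, a test extending such a base class might look like the sketch below (the base class name SparkTestBase and the word-count example are my own illustration, not from the original code):

    import org.junit.Test
    import org.junit.Assert.assertEquals

    class WordCountTest extends SparkTestBase { // hypothetical base class name
        @Test
        def countsWords() {
            // "spark" is the SparkContext field inherited from the base class
            val counts = spark.parallelize(Seq("a", "b", "a")).countByValue()
            assertEquals(2L, counts("a"))
        }
    }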


I haven't looked at this code since last fall (i.e., version 0.8.x), so I can't vouch that it is still accurate or necessary, but it still works for me.


On 06/18/2014 12:59 PM, Lisonbee, Todd wrote:

Disabling parallelExecution has worked for me.

Other alternatives I’ve tried that also work include:

1. Using a lock – this will let tests execute in parallel except for those using a SparkContext. If you have a large number of tests that could execute in parallel, this can shave off some time.

    import scala.concurrent.Lock

    object TestingSparkContext {
        val lock = new Lock()
    }

    // before you instantiate your local SparkContext
    TestingSparkContext.lock.acquire()

    // after you call sc.stop()
    TestingSparkContext.lock.release()
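
If you want the lock released even when a test body throws, one variation (my own sketch, not from Todd's mail; withLocalSpark is a hypothetical helper) wraps everything in try/finally:

    import scala.concurrent.Lock
    import org.apache.spark.SparkContext

    object TestingSparkContext {
        val lock = new Lock()

        // Hypothetical helper: serializes SparkContext usage across tests and
        // guarantees that stop() and release() run even if the body throws.
        def withLocalSpark[T](body: SparkContext => T): T = {
            lock.acquire()
            try {
                val sc = new SparkContext("local", "test spark")
                try body(sc)
                finally sc.stop()
            } finally lock.release()
        }
    }

A test would then call TestingSparkContext.withLocalSpark { sc => ... } instead of managing acquire/release by hand.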

2. Sharing a local SparkContext between tests.

- This is nice because your tests will run faster. Start-up and shutdown are time-consuming (they can add a few seconds per test).

- The downside is that your tests share the same SparkContext, so they are less independent of each other. I haven't seen issues with this yet, but some might crop up.
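
For what it's worth, a minimal sketch of the shared-context approach (the object name and the shutdown hook are my assumptions, not from the original mail):

    import org.apache.spark.SparkContext

    object SharedSparkContext {
        // Created once, on first use, then reused by every test in the JVM.
        lazy val sc: SparkContext = {
            val context = new SparkContext("local", "shared test spark")
            sys.addShutdownHook(context.stop()) // stop when the test JVM exits
            context
        }
    }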

Best,

Todd

*From:* Anselme Vignon [mailto:anselme.vig...@flaminem.com]
*Sent:* Wednesday, June 18, 2014 12:33 AM
*To:* user@spark.apache.org
*Subject:* Re: Unit test failure: Address already in use

Hi,

Could your problem come from the fact that you run your tests in parallel?

If you are running Spark in local mode, you cannot have concurrent Spark instances running. This means that your tests instantiating a SparkContext cannot run in parallel. The easiest fix is to tell sbt not to run tests in parallel.

This can be done by adding the following line in your build.sbt:

parallelExecution in Test := false
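
For context, here is where that line sits in a minimal build.sbt (the project name, Scala version, and Spark version below are placeholders):

    name := "spark-tests"

    scalaVersion := "2.10.4"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"

    // run test classes sequentially so only one local SparkContext exists at a time
    parallelExecution in Test := false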

Cheers,

Anselme

2014-06-17 23:01 GMT+02:00 SK <skrishna...@gmail.com>:

    Hi,

    I have 3 unit tests (independent of each other) in the /src/test/scala
    folder. When I run each of them individually using sbt "test-only <test>",
    all 3 pass. But when I run them all using "sbt test", they fail with the
    warning below. I am wondering if the binding exception causes the job to
    fail, thereby causing the test failures. If so, what can I do to address
    this binding exception? I am running these tests locally on a standalone
    machine (i.e. SparkContext("local", "test")).


    14/06/17 13:42:48 WARN component.AbstractLifeCycle: FAILED
    org.eclipse.jetty.server.Server@3487b78d: java.net.BindException:
    Address already in use
    java.net.BindException: Address already in use
            at sun.nio.ch.Net.bind0(Native Method)
            at sun.nio.ch.Net.bind(Net.java:174)
            at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:139)
            at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:77)


    thanks




