Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/18805
I haven't been able to reproduce the issue locally, but looking at the Jenkins logs I see a bunch of exceptions like these:
```
17/10/13 06:53:26.609 dispatcher-event-loop-15 ERROR Worker: Failed to launch executor app-20171013030138-0000/3 for Test replay.
java.lang.IllegalStateException: Shutdown hooks cannot be modified during shutdown.
        at org.apache.spark.util.SparkShutdownHookManager.add(ShutdownHookManager.scala:195)
```
And:
```
17/10/13 06:53:26.687 pool-1-thread-1-ScalaTest-running-ExternalAppendOnlyMapSuite WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:127)
org.apache.spark.util.collection.ExternalAppendOnlyMapSuite$$anonfun$12.apply$mcV$sp(ExternalAppendOnlyMapSuite.scala:30
```
Note that the first error mentions the app name used by `ReplayListenerSuite`, but it actually happens in a completely separate test suite (`ExternalAppendOnlyMapSuite`, judging by the second log). At the very least, `ReplayListenerSuite` is doing a poor job of cleaning up after itself, and we should fix that.
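
For what it's worth, the usual fix for this kind of leak is to make sure the suite stops its `SparkContext` even when a test body throws, e.g. in an `afterEach` hook. A minimal sketch of that pattern (hypothetical suite name, not the actual `ReplayListenerSuite` code):
```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterEach, FunSuite}

// Sketch only: stop the SparkContext after every test so a failing test
// does not leak a running context into later suites.
class ReplaySketchSuite extends FunSuite with BeforeAndAfterEach {
  @transient private var sc: SparkContext = _

  override def afterEach(): Unit = {
    try {
      if (sc != null) {
        sc.stop()  // runs even if the test body threw
        sc = null
      }
    } finally {
      super.afterEach()
    }
  }

  test("replay events") {
    sc = new SparkContext(new SparkConf().setMaster("local").setAppName("Test replay"))
    // ... exercise the replay logic here ...
  }
}
```
Anything along those lines (or a shared cleanup trait) would keep one suite's failures from bleeding into the next.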