[ https://issues.apache.org/jira/browse/SPARK-8109?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Reynold Xin updated SPARK-8109:
-------------------------------
    Issue Type: Sub-task  (was: Improvement)
        Parent: SPARK-8113

> TestSQLContext's static initialization is run during MiMa tests, causing SparkContexts to be created
> -----------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-8109
>                 URL: https://issues.apache.org/jira/browse/SPARK-8109
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL, Tests
>            Reporter: Josh Rosen
>
> Check out this stack trace, which occurred during MiMa tests in the pull request builder:
> {code}
> java.net.BindException: Address already in use
>       at sun.nio.ch.Net.bind0(Native Method)
>       at sun.nio.ch.Net.bind(Net.java:444)
>       at sun.nio.ch.Net.bind(Net.java:436)
>       at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
>       at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
>       at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
>       at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
>       at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
>       at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>       at org.eclipse.jetty.server.Server.doStart(Server.java:293)
>       at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>       at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:228)
>       at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:238)
>       at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:238)
>       at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1991)
>       at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>       at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982)
>       at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:238)
>       at org.apache.spark.ui.WebUI.bind(WebUI.scala:117)
>       at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:448)
>       at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:448)
>       at scala.Option.foreach(Option.scala:236)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:448)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:135)
>       at org.apache.spark.sql.test.LocalSQLContext.<init>(TestSQLContext.scala:29)
>       at org.apache.spark.sql.test.TestSQLContext$.<init>(TestSQLContext.scala:55)
>       at org.apache.spark.sql.test.TestSQLContext$.<clinit>(TestSQLContext.scala)
>       at java.lang.Class.forName0(Native Method)
>       at java.lang.Class.forName(Class.java:274)
>       at scala.reflect.runtime.JavaMirrors$JavaMirror.javaClass(JavaMirrors.scala:500)
>       at scala.reflect.runtime.JavaMirrors$JavaMirror.tryJavaClass(JavaMirrors.scala:505)
>       at scala.reflect.runtime.SymbolLoaders$PackageScope.lookupEntry(SymbolLoaders.scala:109)
>       at scala.reflect.internal.Types$Type.findMember(Types.scala:1185)
>       at scala.reflect.internal.Types$Type.memberBasedOnName(Types.scala:722)
>       at scala.reflect.internal.Types$Type.member(Types.scala:680)
>       at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:43)
>       at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
>       at scala.reflect.internal.Mirrors$RootsBase.staticModuleOrClass(Mirrors.scala:72)
>       at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:161)
>       at scala.reflect.internal.Mirrors$RootsBase.staticModule(Mirrors.scala:21)
>       at org.apache.spark.tools.GenerateMIMAIgnore$$anonfun$privateWithin$1.apply(GenerateMIMAIgnore.scala:72)
>       at org.apache.spark.tools.GenerateMIMAIgnore$$anonfun$privateWithin$1.apply(GenerateMIMAIgnore.scala:69)
>       at scala.collection.immutable.HashSet$HashSet1.foreach(HashSet.scala:153)
>       at scala.collection.immutable.HashSet$HashTrieSet.foreach(HashSet.scala:306)
>       at scala.collection.immutable.HashSet$HashTrieSet.foreach(HashSet.scala:306)
>       at scala.collection.immutable.HashSet$HashTrieSet.foreach(HashSet.scala:306)
>       at org.apache.spark.tools.GenerateMIMAIgnore$.privateWithin(GenerateMIMAIgnore.scala:69)
>       at org.apache.spark.tools.GenerateMIMAIgnore$.main(GenerateMIMAIgnore.scala:126)
>       at org.apache.spark.tools.GenerateMIMAIgnore.main(GenerateMIMAIgnore.scala)
> {code}
> Here, TestSQLContext's static initialization code is run during the MiMa checks, and that initialization creates a SparkContext.
> Because MiMa doesn't run with our test system properties, the UI tries to bind to a contended port, which may lead to flakiness.
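> To make the failure mode concrete, here is a minimal sketch (hypothetical names, not the real TestSQLContext code) of how reflectively loading a Scala object runs its body as a static initializer, creating a SparkContext and binding the UI port as a side effect:
> {code}
> import org.apache.spark.{SparkConf, SparkContext}
>
> // Hypothetical stand-in for an eagerly-initializing test context object.
> // A Scala object's body compiles into its static initializer (<clinit>), so
> // merely loading the class constructs the SparkContext and starts its web UI.
> object EagerTestContext {
>   val sparkContext = new SparkContext(
>     new SparkConf(false)
>       .setMaster("local[2]")
>       .setAppName("eager-test-context"))  // the UI binds a port unless spark.ui.enabled=false
> }
>
> object ReflectionRepro {
>   def main(args: Array[String]): Unit = {
>     // Reflection of the kind GenerateMIMAIgnore performs triggers <clinit> of the
>     // object above, which creates the SparkContext and attempts the UI bind.
>     Class.forName("EagerTestContext$")
>   }
> }
> {code}
> One possible mitigation (just a sketch of the idea, not a committed fix) would be to defer the context creation behind a lazy val, or to run such classpath scans with spark.ui.enabled=false, so that loading the class can't contend for a UI port.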



