dongjoon-hyun commented on a change in pull request #25753: [SPARK-29046][SQL] Fix NPE in SQLConf.get when active SparkContext is stopping
URL: https://github.com/apache/spark/pull/25753#discussion_r324398889
##########
File path: sql/core/src/test/scala/org/apache/spark/sql/internal/SQLConfSuite.scala
##########
@@ -320,4 +321,22 @@ class SQLConfSuite extends QueryTest with SharedSparkSession {
     assert(e2.getMessage.contains("spark.sql.shuffle.partitions"))
   }
+  test("SPARK-29046: SQLConf.get shouldn't throw NPE when active SparkContext is stopping") {
+    // Logically, there is only one case where SQLConf.get throws an NPE: there is an active
+    // SparkContext, but it is stopping - in particular, it has set dagScheduler to null.
+
+    val oldSparkContext = SparkContext.getActive
+    Utils.tryWithSafeFinally {
+      // this is necessary to set the new SparkContext as active: it clears the currently
+      // active SparkContext
+      oldSparkContext.foreach(_ => SparkContext.clearActiveContext())
+
+      val conf = new SparkConf().setAppName("test").setMaster("local")
+      LocalSparkContext.withSpark(new SparkContext(conf)) { sc =>
+        sc.dagScheduler = null
+        SQLConf.get
+      }
+    } {
+      oldSparkContext.orElse(Some(null)).foreach(SparkContext.setActiveContext)
Review comment:
Hi, all.
I'm wondering whether this PR's test case is safe in concurrent testing environments.
It might be the root cause of the current outage in the Apache Spark Jenkins jobs:
since this PR was merged, no Jenkins build on the `master` branch has succeeded.
- https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/
Could you take a look?
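For illustration only (this sketch is not from the PR), here is a minimal, Spark-free Scala model of the race the comment above is worried about. `ActiveSlot` is a hypothetical stand-in for `SparkContext`'s process-global active-context slot (`getActive` / `setActiveContext` / `clearActiveContext`). The save/clear/restore pattern used by the test is not atomic, so a concurrently running suite can restore a stale context; the latches below force one such interleaving deterministically:

```scala
import java.util.concurrent.atomic.AtomicReference
import java.util.concurrent.CountDownLatch

// Hypothetical stand-in for SparkContext's process-global active-context slot.
object ActiveSlot {
  private val ref = new AtomicReference[String](null)
  def getActive: Option[String] = Option(ref.get())
  def setActive(v: String): Unit = ref.set(v)
  def clear(): Unit = ref.set(null)
}

object RaceSketch {
  def main(args: Array[String]): Unit = {
    ActiveSlot.setActive("shared-context")

    val saved = new CountDownLatch(1)
    val swapped = new CountDownLatch(1)

    // "Suite B": saves the active slot, waits, then restores what it saved.
    val suiteB = new Thread(() => {
      val old = ActiveSlot.getActive      // saves Some("shared-context")
      saved.countDown()
      swapped.await()
      old.foreach(ActiveSlot.setActive)   // restores a now-stale value
    })
    suiteB.start()

    // "Suite A": runs concurrently, clears the slot and installs its own context.
    saved.await()
    ActiveSlot.clear()
    ActiveSlot.setActive("suite-A-context")
    swapped.countDown()
    suiteB.join()

    // Suite A's context has been silently clobbered by Suite B's restore.
    println(ActiveSlot.getActive) // prints Some(shared-context)
  }
}
```

If two suites following this pattern run in the same JVM (as parallel test suites can), the restore step of one can silently clobber the context installed by the other, which would be consistent with cross-suite failures on a shared Jenkins worker.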
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]