turboFei commented on code in PR #41838:
URL: https://github.com/apache/spark/pull/41838#discussion_r1891226592


##########
sql/api/src/main/scala/org/apache/spark/sql/SqlApiConf.scala:
##########
@@ -0,0 +1,63 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.spark.sql
+
+import java.util.concurrent.atomic.AtomicReference
+
+import scala.util.Try
+
+import org.apache.spark.util.SparkClassUtils
+
+/**
+ * Configuration for all objects that are placed in the `sql/api` project. The normal way of
+ * accessing this class is through `SqlApiConf.get`. If this code is being used with sql/core
+ * then its values are bound to the currently set SQLConf. With Spark Connect, it will default to
+ * hardcoded values.
+ */
+private[sql] trait SqlApiConf {
+  def ansiEnabled: Boolean
+  def caseSensitiveAnalysis: Boolean
+  def maxToStringFields: Int
+}
+
+private[sql] object SqlApiConf {
+  /**
+   * Defines a getter that returns the [[SqlApiConf]] within scope.
+   */
+  private val confGetter = new AtomicReference[() => SqlApiConf](() => DefaultSqlApiConf)
+
+  /**
+   * Sets the active config getter.
+   */
+  private[sql] def setConfGetter(getter: () => SqlApiConf): Unit = {
+    confGetter.set(getter)
+  }
+
+  def get: SqlApiConf = confGetter.get()()
+
+  // Force load SQLConf. This will trigger the installation of a confGetter that points to SQLConf.
+  Try(SparkClassUtils.classForName("org.apache.spark.sql.internal.SQLConf$"))

Review Comment:
   Hi @hvanhovell, we are hitting a deadlock issue that seems to be related to this patch.
   
   I have filed https://issues.apache.org/jira/browse/SPARK-50620 and attached the thread dump.
   
   ```
   "Executor 92 task launch worker for task 2910, task 69.0 in stage 7.0 (TID 
2910)" #152 daemon prio=5 os_prio=0 cpu=616.25ms elapsed=258408.34s 
tid=0x00007f77d67cc330 nid=0x22c9e in Object.wait()  [0x00007f77755fb000]
      java.lang.Thread.State: RUNNABLE
        at org.apache.spark.sql.internal.SQLConf$.<init>(SQLConf.scala:184)
        - waiting on the Class initialization monitor for 
org.apache.spark.sql.internal.SqlApiConf$
        at org.apache.spark.sql.internal.SQLConf$.<clinit>(SQLConf.scala)
   ```
   
   ```
   "Executor 92 task launch worker for task 5362, task 521.0 in stage 10.0 (TID 
5362)" #123 daemon prio=5 os_prio=0 cpu=2443.78ms elapsed=258409.29s 
tid=0x00007f77d60ecad0 nid=0x22c7c in Object.wait()  [0x00007f777e591000]
      java.lang.Thread.State: RUNNABLE
        at java.lang.Class.forName0([email protected]/Native Method)
        - waiting on the Class initialization monitor for 
org.apache.spark.sql.internal.SQLConf$
        at java.lang.Class.forName([email protected]/Class.java:467)
        at 
org.apache.spark.util.SparkClassUtils.classForName(SparkClassUtils.scala:41)
        at 
org.apache.spark.util.SparkClassUtils.classForName$(SparkClassUtils.scala:36)
        at 
org.apache.spark.util.SparkClassUtils$.classForName(SparkClassUtils.scala:83)
        at 
org.apache.spark.sql.internal.SqlApiConf$.$anonfun$new$1(SqlApiConf.scala:73)
   ```
   
   In the first task, `org.apache.spark.sql.internal.SQLConf$.<clinit>` is running and is waiting on the Class initialization monitor for `org.apache.spark.sql.internal.SqlApiConf$`.
   
   In the second task, `org.apache.spark.sql.internal.SqlApiConf$.$anonfun$new$1` (the `classForName` call above) is running and is waiting on the Class initialization monitor for `org.apache.spark.sql.internal.SQLConf$`.
   
   Each thread holds the initialization monitor of one of the two classes while waiting for the other, so the two class initializers deadlock.
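   
   As an illustration only (hypothetical names, not Spark's actual code), a minimal Scala sketch of this kind of circular class-initialization deadlock is two objects whose initializers force each other's initialization, touched concurrently from two threads:
   
   ```scala
   // Hypothetical repro of a circular class-initialization deadlock.
   // The object names are illustrative; they are not the real Spark classes.
   object InitDeadlockRepro {
     object ApiConf {
       // Mirrors SqlApiConf$'s initializer force-loading SQLConf$ via classForName.
       Class.forName("InitDeadlockRepro$CoreConf$")
     }
   
     object CoreConf {
       // Mirrors SQLConf$'s initializer touching SqlApiConf during its own <clinit>.
       val default: ApiConf.type = ApiConf
     }
   
     def main(args: Array[String]): Unit = {
       // One thread forces ApiConf's <clinit>, the other forces CoreConf's <clinit>.
       // With an unlucky interleaving, each thread holds one class initialization
       // monitor while waiting on the other, like the two executor threads above.
       val t1 = new Thread(new Runnable { def run(): Unit = println(ApiConf) })
       val t2 = new Thread(new Runnable { def run(): Unit = println(CoreConf) })
       t1.start(); t2.start()
       t1.join(); t2.join()
     }
   }
   ```
   
   Whether a run hangs depends on thread scheduling, but when it does, the thread dump shows both threads as RUNNABLE while "waiting on the Class initialization monitor" of the other class, which matches the dumps above.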
   
   


