cfmcgrady commented on code in PR #5039:
URL: https://github.com/apache/kyuubi/pull/5039#discussion_r1271660951


##########
externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/KyuubiSparkUtil.scala:
##########
@@ -95,7 +95,5 @@ object KyuubiSparkUtil extends Logging {
     }
   }
 
-  // Given that we are on the Spark SQL engine side, the [[org.apache.spark.SPARK_VERSION]] can be
-  // represented as the runtime version of the Spark SQL engine.
-  lazy val SPARK_ENGINE_RUNTIME_VERSION = SemanticVersion(org.apache.spark.SPARK_VERSION)
+  lazy val sparkRuntimeVersion: SemanticVersion = SemanticVersion(SPARK_VERSION)

Review Comment:
   > I'm generalizing Spark's runtime semantic version in sparkRuntimeVersion as the same as in other engines and modules.
   
   As the removed code comment states, [[org.apache.spark.SPARK_VERSION]] denotes the runtime version of the Spark SQL engine only when the suite is executed on the engine side. When the suite runs on the client side, [[org.apache.spark.SPARK_VERSION]] cannot represent the engine's runtime version, as shown in https://github.com/apache/kyuubi/pull/4381
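   
   To make the distinction concrete, here is a minimal, self-contained sketch (the `SemVer` stand-in and the `runSql` hook are hypothetical, not Kyuubi's actual API): `org.apache.spark.SPARK_VERSION` is a constant baked into whatever Spark jars sit on the local classpath, so it only matches the engine's runtime version when the code runs inside the engine JVM; a client has to ask the live engine instead.
   
```scala
// Stand-in type, just so the sketch compiles on its own; Kyuubi's real
// SemanticVersion lives elsewhere and has a richer API.
case class SemVer(major: Int, minor: Int)
object SemVer {
  def apply(v: String): SemVer = {
    val parts = v.split('.')
    SemVer(parts(0).toInt, parts(1).toInt)
  }
}

object RuntimeVersionSketch {
  // Engine side: the Spark jars on the classpath ARE the runtime,
  // so the compile-time constant is trustworthy here.
  def engineSide(): SemVer = SemVer(org.apache.spark.SPARK_VERSION)

  // Client side: the remote engine may run a different Spark than the
  // client's classpath, so the version must come from the engine itself.
  // `runSql` is a hypothetical hook; Spark's `version()` function returns
  // a string like "3.4.1 <revision>", hence taking the first token.
  def clientSide(runSql: String => String): SemVer =
    SemVer(runSql("SELECT version()").split(' ').head)
}
```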
   
   > Here I prefer the import way rather than the full name inline, which still loads Spark version when needed.
   
   I have no opinion on whether to use the `import` style, but I prefer uppercase for constants.
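   
   For reference, a quick sketch of the two styles (with a stand-in `SemanticVersion` so it compiles standalone), using the uppercase naming I would prefer:
   
```scala
// Stand-in so this compiles on its own; Kyuubi's real SemanticVersion differs.
case class SemanticVersion(raw: String)

object InlineStyle {
  // Fully-qualified inline reference: no import needed, and an uppercase
  // name since the value is effectively a constant.
  lazy val SPARK_RUNTIME_VERSION: SemanticVersion =
    SemanticVersion(org.apache.spark.SPARK_VERSION)
}

object ImportStyle {
  import org.apache.spark.SPARK_VERSION

  // The import statement itself loads nothing; with a lazy val, Spark is
  // still only touched on first access, same as the inline style.
  lazy val SPARK_RUNTIME_VERSION: SemanticVersion =
    SemanticVersion(SPARK_VERSION)
}
```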


