Kent Yao created SPARK-35102:
--------------------------------

             Summary: Make spark.sql.hive.version meaningful and not deprecated
                 Key: SPARK-35102
                 URL: https://issues.apache.org/jira/browse/SPARK-35102
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 3.2.0
            Reporter: Kent Yao


First, let's take a look at the definition and its comment.
{code:java}
// A fake config which is only here for backward compatibility reasons. This config
// has no effect to Spark, just for reporting the builtin Hive version of Spark to
// existing applications that already rely on this config.
val FAKE_HIVE_VERSION = buildConf("spark.sql.hive.version")
  .doc(s"deprecated, please use ${HIVE_METASTORE_VERSION.key} to get the Hive version in Spark.")
  .version("1.1.1")
  .fallbackConf(HIVE_METASTORE_VERSION)
{code}

It is meant to report the built-in Hive version, but the current status is unsatisfactory: because it falls back to another config, its value can be changed in many ways, e.g. via --conf or the SET syntax.
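To make the problem concrete, here is a minimal sketch (not Spark's actual implementation) of how fallback-conf resolution lets an override leak into the reported version. The class name, the simple map-based session conf, and the version string "2.3.8" are all illustrative assumptions:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of fallback-conf resolution, NOT Spark's real code.
public class FallbackConfSketch {
    // Built-in (compiled) Hive version; "2.3.8" is a placeholder value.
    public static final String COMPILED_HIVE_VERSION = "2.3.8";
    public static final Map<String, String> sessionConf = new HashMap<>();

    // Current behavior: spark.sql.hive.version falls back to
    // spark.sql.hive.metastore.version, so any --conf/SET override
    // changes what gets reported as the "builtin" Hive version.
    public static String currentHiveVersionReport() {
        String v = sessionConf.get("spark.sql.hive.version");
        if (v != null) return v;
        v = sessionConf.get("spark.sql.hive.metastore.version");
        return v != null ? v : COMPILED_HIVE_VERSION;
    }

    public static void main(String[] args) {
        // Default: the compiled version is reported.
        System.out.println(currentHiveVersionReport());
        // After overriding the metastore version, the fallback leaks through.
        sessionConf.put("spark.sql.hive.metastore.version", "1.2.1");
        System.out.println(currentHiveVersionReport());
    }
}
```

Once the metastore version is overridden, the config no longer tells you which Hive version Spark was actually compiled against.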

It has been marked as deprecated but kept around for a long time. I guess removing it is hard for us, and not even necessary.

On second thought, it is actually good for us to keep it and make it work alongside `spark.sql.hive.metastore.version`. When `spark.sql.hive.metastore.version` is changed, `spark.sql.hive.version` can still statically report the compiled Hive version, which is useful when an error occurs in that case. So this parameter should be fixed to the compiled Hive version.
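The proposed behavior can be sketched as follows: the config stops falling back and always answers with the version Spark was built against. Again, the class name and version strings are illustrative assumptions, not the actual patch:

```java
import java.util.Map;

// Illustrative sketch of the proposed behavior, NOT the actual patch:
// spark.sql.hive.version always reports the compiled Hive version,
// regardless of what spark.sql.hive.metastore.version is set to.
public class PinnedHiveVersionSketch {
    // Built-in (compiled) Hive version; "2.3.8" is a placeholder value.
    public static final String COMPILED_HIVE_VERSION = "2.3.8";

    public static String hiveVersionReport(Map<String, String> sessionConf) {
        // Ignore any user override; the value is fixed at build time.
        return COMPILED_HIVE_VERSION;
    }

    public static void main(String[] args) {
        Map<String, String> conf = new java.util.HashMap<>();
        conf.put("spark.sql.hive.metastore.version", "1.2.1");
        // Even with the metastore version overridden, the report is unchanged.
        System.out.println(hiveVersionReport(conf));
    }
}
```

With this behavior, a user debugging a metastore-version mismatch can always recover the built-in Hive version from the config.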

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
