beliefer commented on a change in pull request #23574: [SPARK-26643][SQL] Fix
incorrect analysis exception about set table properties.
URL: https://github.com/apache/spark/pull/23574#discussion_r248906962
##########
File path:
sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala
##########
```diff
@@ -129,7 +129,7 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
     val invalidKeys = table.properties.keys.filter(_.startsWith(SPARK_SQL_PREFIX))
     if (invalidKeys.nonEmpty) {
       throw new AnalysisException(s"Cannot persistent ${table.qualifiedName} into hive metastore " +
-        s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
+        s"as table property keys may start with '$SPARK_SQL_PREFIX': " +
```
Review comment:
But when I execute a DDL in spark-sql, it still throws the analysis exception:
```
spark-sql> ALTER TABLE gja_test3 SET TBLPROPERTIES ('test' = 'test');
org.apache.spark.sql.AnalysisException: Cannot persistent work.gja_test3 into hive metastore as table property keys may not start with 'spark.sql.': [spark.sql.partitionProvider];
	at org.apache.spark.sql.hive.HiveExternalCatalog.verifyTableProperties(HiveExternalCatalog.scala:129)
```
As you can see, this DDL contains only a user-specified table property key that does not start with '$SPARK_SQL_PREFIX', yet the exception is still thrown.
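A minimal, self-contained sketch of why the check fires anyway (hypothetical object and method names, not Spark's actual class): the filter in `verifyTableProperties` runs over *all* of the table's property keys, including internal ones that Spark itself attaches to the metadata, such as `spark.sql.partitionProvider`, and not just the keys the user supplied in the DDL.

```scala
// Hypothetical standalone sketch of the prefix check in
// HiveExternalCatalog.verifyTableProperties (names assumed for illustration).
object PropertyCheckSketch {
  val SPARK_SQL_PREFIX = "spark.sql."

  // Returns every property key that starts with the reserved prefix.
  def invalidKeys(properties: Map[String, String]): List[String] =
    properties.keys.filter(_.startsWith(SPARK_SQL_PREFIX)).toList

  def main(args: Array[String]): Unit = {
    // 'test' comes from the user's DDL; 'spark.sql.partitionProvider'
    // is an internal property Spark adds to the table metadata.
    val props = Map(
      "test" -> "test",
      "spark.sql.partitionProvider" -> "catalog")
    // The filter flags the internal key even though the user's key is fine.
    println(invalidKeys(props).mkString(", "))
  }
}
```

This is why the ALTER TABLE above fails even with a harmless user key: the internal key trips the same check.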