beliefer commented on a change in pull request #23574: [SPARK-26643][SQL] Fix 
incorrect analysis exception about set table properties.
URL: https://github.com/apache/spark/pull/23574#discussion_r249724139
 
 

 ##########
 File path: 
sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala
 ##########
 @@ -129,7 +129,7 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, 
hadoopConf: Configurat
     val invalidKeys = 
table.properties.keys.filter(_.startsWith(SPARK_SQL_PREFIX))
     if (invalidKeys.nonEmpty) {
       throw new AnalysisException(s"Cannot persistent ${table.qualifiedName} 
into hive metastore " +
-        s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
+        s"as table property keys may start with '$SPARK_SQL_PREFIX': " +
 
 Review comment:
   I maintain a Spark fork with a lot of custom functions. My Spark does not handle these table properties correctly, which leads to the analysis exception.
   I have tested Spark 2.3 and Spark 2.4, and this analysis exception does not appear there.
   So we should go back to the beginning and discuss the incorrect analysis exception.
   One detail: the analysis exception contains the word 'as', and here 'as' means 'because'. So I still think `as table property keys may start with '$SPARK_SQL_PREFIX'` is better.
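   To make the behavior under discussion concrete, here is a minimal, self-contained sketch (not the actual `HiveExternalCatalog` code; the object and method names are hypothetical) of the check in the diff above: any table property whose key starts with the reserved `spark.sql.` prefix is flagged, and persisting the table fails with the exception whose message wording this thread is about.

```scala
// Hypothetical standalone sketch of the validation shown in the diff:
// collect table property keys that start with the reserved prefix.
object PropertyCheck {
  // Mirrors HiveExternalCatalog.SPARK_SQL_PREFIX ("spark.sql.")
  val SPARK_SQL_PREFIX = "spark.sql."

  // Returns the offending keys; a non-empty result would trigger
  // the AnalysisException in the real catalog code.
  def invalidKeys(properties: Map[String, String]): Seq[String] =
    properties.keys.filter(_.startsWith(SPARK_SQL_PREFIX)).toSeq
}
```

   For example, a table with properties `Map("spark.sql.sources.provider" -> "parquet", "owner" -> "alice")` would fail the check because of the first key, while `owner` is fine.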

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]
