This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new f9776e3  [MINOR][SQL] Fix typo in exception about set table properties.
f9776e3 is described below

commit f9776e389215255dc61efaa2eddd92a1fa754b48
Author: gengjiaan <gengji...@360.cn>
AuthorDate: Thu Feb 21 22:13:47 2019 -0600

    [MINOR][SQL] Fix typo in exception about set table properties.
    
    ## What changes were proposed in this pull request?
    
    The documented behavior of the method verifyTableProperties is:
    
    `If the given table properties contains datasource properties, throw an exception. We will do this check when create or alter a table, i.e. when we try to write table metadata to Hive metastore.`
    
    But the AnalysisException message in verifyTableProperties contains a typo and an incorrectly capitalized word.
    So I changed the exception message from
    
    `Cannot persistent ${table.qualifiedName} into hive metastore`
    
    to
    
    `Cannot persist ${table.qualifiedName} into Hive metastore`
    
    ## How was this patch tested?
    
    Please review http://spark.apache.org/contributing.html before opening a pull request.
    
    Closes #23574 from beliefer/incorrect-analysis-exception.
    
    Authored-by: gengjiaan <gengji...@360.cn>
    Signed-off-by: Sean Owen <sean.o...@databricks.com>
---
 .../src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala  | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala
index c1178ad..11a2192 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala
@@ -128,7 +128,7 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
   private def verifyTableProperties(table: CatalogTable): Unit = {
     val invalidKeys = table.properties.keys.filter(_.startsWith(SPARK_SQL_PREFIX))
     if (invalidKeys.nonEmpty) {
-      throw new AnalysisException(s"Cannot persistent ${table.qualifiedName} into hive metastore " +
+      throw new AnalysisException(s"Cannot persist ${table.qualifiedName} into Hive metastore " +
         s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
         invalidKeys.mkString("[", ", ", "]"))
     }


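For context, the check this one-word fix lives in can be sketched as a self-contained approximation. The object name, the `invalidKeys` helper, and the use of `IllegalArgumentException` in place of Spark's `AnalysisException` are illustrative; the `spark.sql.` prefix and the corrected message mirror the HiveExternalCatalog code in the diff above.

```scala
// Illustrative sketch of the table-property validation shown in the diff.
// Not Spark's actual API: object and helper names are hypothetical, and
// IllegalArgumentException stands in for Spark's AnalysisException.
object VerifyTablePropertiesSketch {
  // Mirrors HiveExternalCatalog.SPARK_SQL_PREFIX.
  val SPARK_SQL_PREFIX = "spark.sql."

  // Keys that would make the real verifyTableProperties throw.
  def invalidKeys(properties: Map[String, String]): Seq[String] =
    properties.keys.filter(_.startsWith(SPARK_SQL_PREFIX)).toSeq

  // Throws if any property key uses the reserved prefix, with the
  // corrected wording ("persist", "Hive") from this commit.
  def verifyTableProperties(qualifiedName: String, properties: Map[String, String]): Unit = {
    val bad = invalidKeys(properties)
    if (bad.nonEmpty) {
      throw new IllegalArgumentException(
        s"Cannot persist $qualifiedName into Hive metastore " +
          s"as table property keys may not start with '$SPARK_SQL_PREFIX': " +
          bad.mkString("[", ", ", "]"))
    }
  }
}
```

A table with only user-level properties passes silently, while any key starting with `spark.sql.` triggers the exception carrying the corrected message.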
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
