Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20564#discussion_r167392724
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
    @@ -1107,11 +1107,6 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
           }
         }
     
    -    // Note: Before altering table partitions in Hive, you *must* set the current database
    -    // to the one that contains the table of interest. Otherwise you will end up with the
    -    // most helpful error message ever: "Unable to alter partition. alter is not possible."
    -    // See HIVE-2742 for more detail.
    -    client.setCurrentDatabase(db)
    --- End diff --
    
    This does not change the behavior; it just restores the original current database in the
    Hive client. I do not think the current implementation has any issue, because we do not
    rely on the current database in the Hive client. However, I am also fine with restoring
    the original. This change does no harm and just makes the code more robust for future
    refactoring.
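
    To illustrate the point, here is a minimal Scala sketch of the save-and-restore pattern
    being discussed. The `HiveClientLike` trait and its method names, as well as the
    simplified `alterPartitions` signature, are assumptions made for illustration only; this
    is not the actual HiveExternalCatalog code.

        object AlterPartitionsSketch {
          // Hypothetical stand-in for the Hive client; method names are assumptions.
          trait HiveClientLike {
            def currentDatabase: String
            def setCurrentDatabase(db: String): Unit
            def alterPartitions(db: String, table: String): Unit
          }

          def alterPartitionsPreservingCurrentDb(
              client: HiveClientLike, db: String, table: String): Unit = {
            // Remember whichever database the client currently points at.
            val original = client.currentDatabase
            try {
              // HIVE-2742: the current database must be the one that contains the
              // table, or Hive fails with "Unable to alter partition. alter is
              // not possible."
              client.setCurrentDatabase(db)
              client.alterPartitions(db, table)
            } finally {
              // Restore the original database so later callers that rely on the
              // client's current database are unaffected.
              client.setCurrentDatabase(original)
            }
          }
        }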

