Github user rdblue commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21122#discussion_r186152832
  
    --- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala ---
    @@ -1354,7 +1354,8 @@ class HiveDDLSuite
         val indexName = tabName + "_index"
         withTable(tabName) {
           // Spark SQL does not support creating index. Thus, we have to use Hive client.
    -      val client = spark.sharedState.externalCatalog.asInstanceOf[HiveExternalCatalog].client
    +      val client =
    +        spark.sharedState.externalCatalog.unwrapped.asInstanceOf[HiveExternalCatalog].client
    --- End diff --
    
    There are several places that access the `client`, like `HiveSessionStateBuilder` and `SaveAsHiveFile`. My point is that assuming the catalog is a `HiveExternalCatalog`, casting it, and reaching into a field of the catalog to get the client isn't a good way to pass the client to where it is needed. Why doesn't this code pass both `client` and `catalog` to these classes to avoid this problem?
    
    I don't think it matters that Hive may be removed in the future. Unless 
removing it is going to happen in a week or two, I'm skeptical that it is going 
to be soon enough that we should ignore problems with the implementation.
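    To make the design point concrete, here is a minimal sketch of the two wiring styles. These are simplified stand-ins, not the real Spark/Hive classes: `StubHiveClient`, `SaveAsHiveFileLike`, and `runSqlHive` are hypothetical names invented for illustration.
    
    ```scala
    // Illustrative sketch only: simplified stand-ins for the real
    // Spark/Hive classes, showing cast-based access vs. explicit passing.
    object CatalogWiringSketch {
      trait HiveClient { def runSqlHive(sql: String): String }
    
      // Hypothetical stub client, just enough to run the example.
      class StubHiveClient extends HiveClient {
        def runSqlHive(sql: String): String = s"ran: $sql"
      }
    
      trait ExternalCatalog
      class HiveExternalCatalog(val client: HiveClient) extends ExternalCatalog
    
      // The pattern under review: assume the concrete catalog type and cast.
      // This throws ClassCastException if a different catalog is configured.
      def clientViaCast(catalog: ExternalCatalog): HiveClient =
        catalog.asInstanceOf[HiveExternalCatalog].client
    
      // The suggested alternative: pass both dependencies explicitly, so the
      // consumer never needs to know the catalog's concrete type.
      class SaveAsHiveFileLike(catalog: ExternalCatalog, client: HiveClient) {
        def save(): String = client.runSqlHive("INSERT OVERWRITE ...")
      }
    
      def main(args: Array[String]): Unit = {
        val client  = new StubHiveClient
        val catalog = new HiveExternalCatalog(client)
    
        // Both styles reach the same client here, but only the second
        // survives a change of catalog implementation.
        assert(clientViaCast(catalog) eq client)
        val consumer = new SaveAsHiveFileLike(catalog, client)
        println(consumer.save())
      }
    }
    ```
    
    With explicit passing, callers like the `SaveAsHiveFileLike` stand-in compile against the `ExternalCatalog` and `HiveClient` interfaces only, so no call site has to repeat the cast.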


---
