GitHub user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18849#discussion_r132060765
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala ---
    @@ -908,7 +909,13 @@ private[hive] object HiveClientImpl {
         }
         // after SPARK-19279, it is not allowed to create a hive table with an empty schema,
         // so here we should not add a default col schema
    -    if (schema.isEmpty && DDLUtils.isDatasourceTable(table)) {
    +    //
    +    // Because HiveExternalCatalog sometimes writes back "raw" tables that have not been
    +    // completely translated to Spark's view, the provider information needs to be looked
    +    // up in two places.
    +    val provider = table.provider.orElse(
    --- End diff --
    
    If you look at the bug, there are two exceptions. One gets logged; the second is thrown and is what caused the test to fail in my 2.1-based branch.
    
    The exception happened because `alterTableSchema` was writing back the result of `getRawTable`. That raw table does not have the provider field set; instead, the provider is stored in the table's properties. This check looks in both places, so other code that writes back a table fetched with `getRawTable` can still pass it.
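
    For illustration, here is a minimal sketch (mine, not the exact patch) of what "looking in both places" means, assuming the provider of a raw table is recorded under the `spark.sql.sources.provider` property key that `HiveExternalCatalog` uses; the helper name `lookupProvider` is hypothetical:

        import org.apache.spark.sql.catalyst.catalog.CatalogTable

        // Sketch: prefer the translated `provider` field, then fall back to the
        // raw table property (property key assumed, not taken from the diff).
        def lookupProvider(table: CatalogTable): Option[String] =
          table.provider.orElse(table.properties.get("spark.sql.sources.provider"))

    With something like this, a raw table written back by the catalog is still recognized as a data source table even though its `provider` field is empty.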
    
    As I explained in a previous comment, this doesn't happen anymore for 
`alterTableSchema` because of the other changes. But there's still code in the 
catalog class that writes back tables fetched with `getRawTable`, so this feels 
safer.

