GitHub user budde commented on the issue:

    https://github.com/apache/spark/pull/16944
  
    I think I'm still a little unclear on exactly which components you're using: 
you're using Spark SQL via ```spark-shell``` to create the table, then using 
Hive to alter it, then querying the table again via ```spark-shell```? Is this 
an external metastore or one managed locally by Spark?
    
    If the table is created with Spark, then Spark should be storing the schema 
under the table properties at that point. This was the behavior prior to this 
change as well. If the table schema is then changed by another application that 
does not also update the schema in the table properties, then Spark SQL's 
behavior *should* be the same as it was in 2.1.0.
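    
    For what it's worth, one quick way to check whether the schema actually made 
it into the table properties is to query them from ```spark-shell```. This is 
just a sketch, and the table name is hypothetical; the 
```spark.sql.sources.schema.*``` keys are where Spark serializes the schema:
    
    ```scala
    // Minimal sketch, assuming a spark-shell with Hive support and a
    // hypothetical table named "my_table". If Spark wrote the schema into
    // the table properties, the spark.sql.sources.schema.* keys show up here.
    spark.sql("SHOW TBLPROPERTIES my_table")
      .where("key LIKE 'spark.sql.sources.schema%'")
      .show(truncate = false)
    ```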
    
    The fact that you got the exception at all, though, seems to indicate that 
the schema wasn't properly read from the table properties or that some state 
wasn't managed properly. I'm going to try to recreate this locally to figure 
out what is happening here.
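    
    The repro I have in mind looks roughly like this (a sketch, with 
hypothetical table and column names rather than your exact failing case):
    
    ```scala
    // Step 1, in spark-shell: create and populate the table so Spark writes
    // its schema into the table properties.
    spark.sql("CREATE TABLE repro_tbl (id INT, name STRING) STORED AS PARQUET")
    spark.sql("INSERT INTO repro_tbl VALUES (1, 'a')")

    // Step 2, from the Hive CLI (outside Spark), alter the table so the
    // metastore schema diverges from the schema in the table properties:
    //   ALTER TABLE repro_tbl ADD COLUMNS (extra STRING);

    // Step 3, back in spark-shell: query the table and see whether the stale
    // schema in the table properties triggers the exception you saw.
    spark.sql("SELECT * FROM repro_tbl").show()
    ```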

