Github user liancheng commented on the pull request:

    https://github.com/apache/spark/pull/8026#issuecomment-128643963
  
    A summary of my offline discussion with @chenghao-intel:
    
    The real problem here is that the partition column types of the newly 
refreshed partition spec don't match those in the user-specified spec. The 
current fix simply disables refreshing the partition spec, which is not 
preferable. My suggestion is to factor out the [partition values casting 
part][1] of the `partitionSpec` method and reuse it in `refresh()`, so that 
the discovered partition values are cast to the proper data types and the 
`partitionColumns` from the user-specified partition spec are simply reused.
    
    [1]: https://github.com/apache/spark/blob/ebfd91c542aaead343cb154277fcf9114382fee7/sql/core/src/main/scala/org/apache/spark/sql/sources/interfaces.scala#L460-L473
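
    To make the idea concrete, here's a rough sketch of what the factored-out 
helper could look like, written as if it lived inside `HadoopFsRelation` next 
to `partitionSpec` (so `Partition`, `PartitionSpec`, `Cast`, `Literal`, 
`InternalRow`, and `StructType` are already in scope there). The helper name 
and exact signature below are only illustrative, not the actual code:
    
    ```scala
    // Sketch only: cast the values of a freshly discovered partition spec to
    // the types declared in the user-specified partition schema, and keep the
    // user-specified partition columns instead of the inferred ones.
    private def castPartitionValues(
        discovered: PartitionSpec,
        partitionSchema: StructType): PartitionSpec = {
      val discoveredTypes = discovered.partitionColumns.map(_.dataType)
      val castedPartitions = discovered.partitions.map {
        case p @ Partition(values, _) =>
          // Wrap each discovered value in a Literal of its inferred type,
          // then cast it to the corresponding user-specified type.
          val castedValues = partitionSchema.zipWithIndex.map { case (f, i) =>
            val dt = discoveredTypes(i)
            Cast(Literal.create(values.get(i, dt), dt), f.dataType).eval()
          }
          p.copy(values = InternalRow.fromSeq(castedValues))
      }
      PartitionSpec(partitionSchema, castedPartitions)
    }
    ```
    
    Both `partitionSpec` and `refresh()` could then call this helper on the 
freshly discovered spec whenever a user-specified partition schema is present, 
instead of either duplicating the casting logic or skipping the refresh.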

