HeartSaVioR opened a new pull request #35731:
URL: https://github.com/apache/spark/pull/35731


   ### What changes were proposed in this pull request?
   
   This PR fixes `StateSchemaCompatibilityChecker`, which mistakenly swapped `from` (which should be the provided schema) and `to` (which should be the existing schema).
   
   ### Why are the changes needed?
   
   The bug allows a case that should be rejected, and rejects a case that should be allowed.
   
   On one hand, the bug allows a nullable column to be stored into a non-nullable column, which should be prohibited. This is unlikely to cause a runtime problem, since the state schema is conceptual and a row can be stored even when it does not strictly respect the state schema.
   
   The opposite case is worse: the bug disallows storing a non-nullable column into a nullable column, which should be allowed. Spark fails the query in this case.
   
   ### Does this PR introduce _any_ user-facing change?
   
   Yes. After the fix, storing a non-nullable column into a nullable column for state will be allowed, as it should have been.
   
   ### How was this patch tested?
   
   Modified UTs.

