szehon-ho opened a new pull request, #53199:
URL: https://github.com/apache/spark/pull/53199

   
   
   ### What changes were proposed in this pull request?
   Follow up of: https://github.com/apache/spark/pull/53149
   
   1. Make update assignment by field the Spark 4.1 behavior.  For context, 
allowing the assignment key and value to be different struct types in MERGE 
INTO is new in Spark 4.1, so we have a chance to define the behavior.  In Spark, 
nested fields are usually treated like top-level columns, so assignment should 
follow the same behavior: see https://github.com/apache/spark/pull/53149#discussion_r2557262463
   2. Rename the existing config that controls the struct type compatibility 
check in assignments.  We do not need to mention 'source', as the assigned 
value can actually come from anything, not necessarily the source table.
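
   To illustrate the by-field behavior described in point 1, here is a minimal sketch.  The table and column names (`t`, `s`, `info`) are hypothetical, not from the PR:

   ```sql
   -- Hypothetical schemas (illustrative only):
   --   target: t(id INT, info STRUCT<a: INT, b: STRING>)
   --   source: s(id INT, info STRUCT<b: STRING, a: INT>)  -- same fields, different order
   MERGE INTO t
   USING s
   ON t.id = s.id
   WHEN MATCHED THEN UPDATE SET t.info = s.info;
   -- With by-field assignment (the Spark 4.1 behavior this PR sets),
   -- s.info.a is assigned to t.info.a and s.info.b to t.info.b,
   -- resolved by field name rather than by position.
   ```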
   
   
   
   ### Why are the changes needed?
   See above
   
   
   ### Does this PR introduce _any_ user-facing change?
   No, this feature is unreleased (allowing the assignment source to have a 
different struct type than the target).
   
   
   ### How was this patch tested?
   Existing unit tests.
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
