gengliangwang opened a new pull request #25453: [SPARK-28730][SQL] Configurable type coercion policy for table insertion
URL: https://github.com/apache/spark/pull/25453
 
 
   ## What changes were proposed in this pull request?
   
   This PR follows up on the discussion on the dev list: http://apache-spark-developers-list.1001551.n3.nabble.com/Discuss-Follow-ANSI-SQL-on-table-insertion-td27531.html#a27562.
 
   Here I propose making the store assignment rules in the analyzer configurable, so that the behavior of V1 and V2 table insertion is consistent.
   When inserting a value into a column with a different data type, Spark performs type coercion. After this PR, we support two policies for the type coercion rules: legacy and strict.
   1. With the legacy policy, Spark allows casting any value to any data type, and a null result is returned when the conversion is invalid. The legacy policy is the only behavior in Spark 2.x, and it is compatible with Hive.
   2. With the strict policy, Spark doesn't allow any possible precision loss or data truncation in type coercion; e.g. `int` -> `long` and `float` -> `double` are not allowed (see the sketch after this list).
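   
   Below is a minimal sketch of the two policies from a `spark-shell` session. The flag name `spark.sql.storeAssignmentPolicy` is an assumption for illustration; this description does not spell out the config key.
   
```scala
// Hypothetical config key for the new policy switch (not named in the
// PR description above).
val POLICY_KEY = "spark.sql.storeAssignmentPolicy"

spark.sql("CREATE TABLE t (i INT) USING parquet")

// Legacy: any value can be cast to the column type; an invalid
// conversion yields null instead of an error (Spark 2.x behavior).
spark.conf.set(POLICY_KEY, "legacy")
spark.sql("INSERT INTO t VALUES ('not a number')")    // inserts a NULL row

// Strict: coercions that may lose precision or truncate data are
// rejected at analysis time.
spark.conf.set(POLICY_KEY, "strict")
spark.sql("INSERT INTO t VALUES (CAST(1 AS BIGINT))") // AnalysisException
```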
   
   To ensure backward compatibility with existing queries, the default store 
assignment policy is "legacy".
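   
   And a quick check of the default, under the same assumed flag name:
   
```scala
// With no explicit setting, the policy resolves to legacy, so existing
// INSERT queries keep their Spark 2.x semantics.
val policy = spark.conf.get("spark.sql.storeAssignmentPolicy")
assert(policy.equalsIgnoreCase("legacy"))
```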
   ## How was this patch tested?
   
   Unit tests.
   
