rohitrastogi commented on code in PR #399:
URL: https://github.com/apache/datafusion-comet/pull/399#discussion_r1593107084


##########
core/src/execution/datafusion/expressions/cast.rs:
##########
@@ -232,6 +232,189 @@ macro_rules! cast_int_to_int_macro {
     }};
 }
 
+// When Spark casts to Byte/Short Types, it does not cast directly to Byte/Short.

Review Comment:
   The float and decimal macros have a similar shape, so I considered consolidating them. One way to do this is to generalize the macros to accept closures that handle the differences between the decimal and float casts. I think the generalized macro would have to accept closures for the following functional differences:
   1) determining whether there is an overflow
   2) generating the appropriate overflow error
   3) transforming the downcast value (truncating, casting) before entering the match statement
   
   I tried this, but the resulting code in spark_cast_nonintegral_numeric_to_integral() seemed more opaque and verbose than the current approach.
   
   Please let me know if you see other ways to refactor this to make it more concise.
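   
   For illustration, here is a minimal, self-contained sketch of the closure-based generalization described above. It is not the PR's code: the names (`cast_nonintegral_to_integral`, `CastError`) and the exact closure signatures are hypothetical, and the real macros also handle array iteration and null slots, which this sketch omits.

```rust
/// Hypothetical error type standing in for the crate's real cast error.
#[derive(Debug)]
struct CastError(String);

/// Sketch of a helper that factors the per-type behaviour into closures:
/// `transform` adjusts the source value before the range check (e.g. truncation),
/// `overflows` is the target-type specific range check,
/// `overflow_err` builds the float- or decimal-specific overflow message, and
/// `to_target` performs the final narrowing conversion.
fn cast_nonintegral_to_integral<S, T>(
    value: S,
    ansi_mode: bool,
    transform: impl Fn(S) -> S,
    overflows: impl Fn(&S) -> bool,
    overflow_err: impl Fn(&S) -> CastError,
    to_target: impl Fn(S) -> T,
) -> Result<Option<T>, CastError> {
    let v = transform(value);
    if overflows(&v) {
        if ansi_mode {
            Err(overflow_err(&v))
        } else {
            Ok(None) // legacy mode: overflow becomes NULL
        }
    } else {
        Ok(Some(to_target(v)))
    }
}

fn main() {
    // Float -> Short: truncate toward zero, then range-check against i16.
    let casted = cast_nonintegral_to_integral(
        40000.7_f64,
        false,
        |v| v.trunc(),
        |v| *v < i16::MIN as f64 || *v > i16::MAX as f64,
        |v| CastError(format!("{v} cannot be cast to SMALLINT due to overflow")),
        |v| v as i16,
    );
    println!("{casted:?}"); // Ok(None): 40000 overflows i16 in legacy mode
}
```

   As noted, pushing the float/decimal differences into closures like this tends to make the call sites in spark_cast_nonintegral_numeric_to_integral() more verbose than two specialised macros.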



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscr...@datafusion.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

