harshmotw-db commented on PR #47808:
URL: https://github.com/apache/spark/pull/47808#issuecomment-2301342765

   @MaxGekk Thanks for the response. I don't know of a concrete use case, as 
I am not a data scientist; I was looking at this from a coverage perspective: 
19-digit `interval second` values cannot be cast to a decimal of any 
precision/scale, and there is no apparent reason why they shouldn't be. 
`999999999999.999999` (18 digits) can be cast to decimal, but 
`1000000000000.000000` (19 digits) cannot, and I can't think of a good reason 
for that to be the case.
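   To illustrate the boundary, here is a small Python sketch (plain stdlib 
`decimal`, not Spark code) of why 18 total digits fit but 19 do not, assuming 
the cast target is `Decimal(18, 6)` as the error message suggests: with scale 
6, only 18 - 6 = 12 integer digits are available.

   ```python
   from decimal import Decimal

   def fits_decimal(value: str, precision: int = 18, scale: int = 6) -> bool:
       """Check whether `value` fits a Decimal(precision, scale) type:
       at most `precision` total digits, `scale` of them fractional,
       so at most `precision - scale` integer digits."""
       sign, digits, exponent = Decimal(value).as_tuple()
       frac_digits = max(0, -exponent)          # digits after the point
       if frac_digits > scale:
           return False
       int_digits = max(0, len(digits) + exponent)  # digits before the point
       return int_digits <= precision - scale

   print(fits_decimal("999999999999.999999"))    # 18 digits -> True
   print(fits_decimal("1000000000000.000000"))   # 19 digits -> False
   ```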
   
   Also, if we do want to prohibit this for some reason, the current error 
message does not tell the user that the operation is unsupported. It says 
`0 cannot be represented as Decimal(18, 6).`, which would not make sense to a 
user whose query never mentions `Decimal(18, 6)`.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

