Eugene-Mark commented on PR #36499:
URL: https://github.com/apache/spark/pull/36499#issuecomment-1146638887

   @srowen Thanks for your response. Regarding the first part, `indicate NUMBER 
with the system limits for precision and scale`, we couldn't find any further 
explanation of it. It sounds like the scale and precision are flexible depending 
on the user's input, but cannot exceed the system limit. Since they are 
flexible, Teradata may simply return a scale of `0` to signal this case. (I'm 
actually thinking it might be better to return an invalid value, such as `-1`, 
so that a downstream caller like Spark could handle the case more gracefully.)
   Until it is fixed on the Teradata side (or perhaps it never will be), the 
question becomes which behavior can be tolerated in more cases:
   1. A number like 1234.5678 is rounded to 1234 (current behavior)
   2. A number like 1234 is turned into 1234.0000
   IMHO, the second option seems more reasonable.
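   The tradeoff between the two options can be sketched with Python's `decimal` module (a minimal illustration only, not Spark's actual decimal handling; the scale values and the truncating rounding mode are assumptions chosen to match the behavior described above):

   ```python
   from decimal import Decimal, ROUND_DOWN

   def apply_scale(value: Decimal, scale: int) -> Decimal:
       """Quantize a value to the given scale, emulating how a
       DECIMAL(p, s) column type constrains stored values.
       ROUND_DOWN truncates, matching the '1234.5678 -> 1234'
       behavior described above; the real rounding mode may differ."""
       return value.quantize(Decimal(1).scaleb(-scale), rounding=ROUND_DOWN)

   # Option 1: scale reported as 0 -> fractional digits are lost.
   print(apply_scale(Decimal("1234.5678"), 0))  # 1234

   # Option 2: a fixed nonzero scale -> integers gain trailing zeros,
   # but fractional values survive.
   print(apply_scale(Decimal("1234"), 4))       # 1234.0000
   print(apply_scale(Decimal("1234.5678"), 4))  # 1234.5678
   ```

   Option 2 loses no data, which is why it reads as the more reasonable default here.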
   
   As for "Can a caller work around it in this case with a cast or does that 
not work?": yes, a cast can be a workaround. However, it forces the user to 
keep track of the precision and scale of every NUMBER column, which becomes 
tedious when a complex query has many columns to look after. It also somewhat 
defeats the flexibility of the original Number(*) definition.
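   To make the tedium concrete, here is a sketch of what the per-column cast workaround amounts to, as plain SQL string building (the table and column names and the precision/scale pairs are hypothetical; this is not a real Spark or Teradata API):

   ```python
   def cast_select(table: str, decimal_columns: dict) -> str:
       """Build a SELECT that explicitly casts each NUMBER column to a
       concrete DECIMAL(precision, scale). Every column must be listed
       by hand, which is exactly what makes this workaround tedious
       for wide tables."""
       casts = [
           f"CAST({col} AS DECIMAL({p}, {s})) AS {col}"
           for col, (p, s) in decimal_columns.items()
       ]
       return f"SELECT {', '.join(casts)} FROM {table}"

   # Hypothetical table with two NUMBER(*) columns:
   query = cast_select("orders", {"amount": (38, 4), "quantity": (10, 0)})
   print(query)
   # SELECT CAST(amount AS DECIMAL(38, 4)) AS amount, CAST(quantity AS DECIMAL(10, 0)) AS quantity FROM orders
   ```

   With dozens of columns, the user has to know and maintain a correct (precision, scale) pair for each one, which the flexible Number(*) definition was meant to avoid.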
   
   Anyway, I agree with you that it seems hard to find a "correct" answer here; 
it is more of a tradeoff, and it also needs documentation to call it out.

-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

