Github user travishegner commented on the pull request:
https://github.com/apache/spark/pull/8780#issuecomment-140936357
I'm not sure if Oracle can be associated with anything *reasonable*, but
sometimes you have to play the hand you are dealt. :)
I can only answer your question with a question... Would there ever be a
use case in the Decimal() class where the precision and/or the scale would be
set to a negative value?
If not, then I'd have to imagine that this patch makes the intention of the
code more accurate across the board. If so, then I may have to explore an
Oracle-only type of patch, which may or may not ever be committed back,
depending on its usefulness to the community.
I'd have to assume that there isn't a use case for negative values, given
the way precision and scale are used and defined, but you'll have to forgive
any ignorance on my part as I'm still fairly new to Scala. I hadn't even
browsed the source for Spark until about one week ago. I'm still in the alpha
stages of even testing Spark in general, so while this patch seems to have
solved the problem for me in my testing, I could easily be overlooking something.
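For anyone following along, here is a minimal sketch (not the actual patch) of the kind of guard I have in mind, assuming a JDBC source such as Oracle that can report a negative scale (for example, for an unqualified NUMBER column). The object and method names are made up purely for illustration:

```scala
import java.sql.ResultSetMetaData
import org.apache.spark.sql.types.{DataType, DecimalType}

// Hypothetical helper (names are illustrative, not part of the patch):
// clamp JDBC-reported precision/scale before building a DecimalType, so
// that negative values never reach the Decimal() class.
object DecimalMetadataSanitizer {
  def decimalTypeFor(md: ResultSetMetaData, column: Int): DataType = {
    val precision = md.getPrecision(column)
    val scale = md.getScale(column)
    if (precision <= 0 || scale < 0) {
      // Fall back to the widest decimal Spark SQL supports (precision 38)
      // rather than propagating a negative precision or scale.
      DecimalType(38, 0)
    } else {
      DecimalType(math.min(precision, 38), scale)
    }
  }
}
```

Whether the clamping belongs in a generic spot like this or only in an Oracle-specific dialect is exactly the question above.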