GitHub user travishegner opened a pull request:

    https://github.com/apache/spark/pull/8780

    [SPARK-10648] Proposed bug fix when Oracle returns -127 as the scale of a
    numeric type

    In my environment, the precision and scale are undefined in the Oracle
    database, but Spark detects them as 0 and -127, respectively.
    
    If I understand those two values correctly, neither should ever be defined
    as less than zero, so the proposed change defaults the precision and scale
    instead of trying to use the erroneous values.
    
    If there is a valid use case for a negative precision or scale, I can
    rework this to test for the exact 0 and -127 combination and handle it
    appropriately.
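
    For illustration, here is a minimal, self-contained Scala sketch of the
    defaulting behaviour described above. The object name
    `OracleDecimalDefaults`, the `normalize` helper, and the fallback values
    38 and 10 are assumptions made for this example only; they are not taken
    from the actual patch, which would apply the same idea inside Spark's
    Oracle JDBC dialect when mapping NUMBER columns to a decimal type.

        object OracleDecimalDefaults {
          // Oracle reports a NUMBER column declared without precision/scale
          // back through JDBC as precision 0 and scale -127.
          val DefaultPrecision = 38
          val DefaultScale = 10

          // Fall back to the defaults whenever the driver hands back values
          // that can never be valid (precision <= 0 or scale < 0); columns
          // with a declared precision and scale pass through untouched.
          def normalize(precision: Int, scale: Int): (Int, Int) = {
            val p = if (precision <= 0) DefaultPrecision else precision
            val s = if (scale < 0) DefaultScale else scale
            (p, s)
          }

          def main(args: Array[String]): Unit = {
            println(normalize(0, -127)) // (38,10) -- undefined Oracle NUMBER
            println(normalize(10, 2))   // (10,2)  -- declared NUMBER(10,2)
          }
        }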

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/travishegner/spark master

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/8780.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #8780
    
----
commit b0c3be317d0b681e183543de97093120a51a6222
Author: Travis Hegner <[email protected]>
Date:   2015-09-16T20:12:23Z

    Proposed bug fix when oracle returns -127 as a scale to a numeric type

----


