Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/6159#issuecomment-103236638
  
    I'm adding a test for the overflow and noticed something interesting: when we overflow to `-2147483648` and pass that value to `allocate()`, the `Math.max` inside `allocate()` silently clamps it, so we end up allocating an array of size 64, which should cause problems during rehashing. It looks like the `Math.max` was a carryover from Reynold's original LongToLong map: https://github.com/rxin/jvm-unsafe-utils/blame/master/core/src/main/java/com/databricks/unsafe/util/LongToLongMap.java#L184. I'll strengthen the internal assertions to catch this.
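
    For anyone following along, here is a minimal sketch of the failure mode, not the actual Spark code: the class name, the 64-element floor, and the `allocate()` signature are assumptions chosen to mirror the behavior described above (a capacity that overflows to `Integer.MIN_VALUE` slips past a `Math.max`-style clamp and comes back as a tiny 64-slot table).

    ```java
    // Sketch only: illustrates how Math.max hides an overflowed capacity
    // instead of failing fast. Names and the 64 floor are assumptions.
    public class CapacityOverflowSketch {

      // Assumed minimum table size, matching the size-64 array noted above.
      private static final int MIN_CAPACITY = 64;

      // Stand-in for an allocate() that clamps with Math.max rather than
      // rejecting a negative (overflowed) requested capacity.
      static long[] allocate(int requestedCapacity) {
        int capacity = Math.max(requestedCapacity, MIN_CAPACITY);
        return new long[capacity];
      }

      public static void main(String[] args) {
        int capacity = 1 << 30;      // 1073741824
        int doubled = capacity * 2;  // overflows to -2147483648
        System.out.println("requested: " + doubled);                  // -2147483648
        System.out.println("allocated: " + allocate(doubled).length); // 64
      }
    }
    ```

    The point is that the clamp masks the overflow rather than surfacing it, which is why asserting on the requested capacity before allocation seems like the right place to catch this.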

