[ https://issues.apache.org/jira/browse/HADOOP-5255?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12675937#action_12675937 ]
Hudson commented on HADOOP-5255:
--------------------------------
Integrated in Hadoop-trunk #763 (See [http://hudson.zones.apache.org/hudson/job/Hadoop-trunk/763/])
> Fix for HADOOP-5079 HashFunction inadvertently destroys some randomness
> -----------------------------------------------------------------------
>
> Key: HADOOP-5255
> URL: https://issues.apache.org/jira/browse/HADOOP-5255
> Project: Hadoop Core
> Issue Type: Bug
> Components: io
> Affects Versions: 0.20.0, 0.21.0
> Reporter: stack
> Assignee: Jonathan Ellis
> Priority: Minor
> Fix For: 0.20.0
>
> Attachments: hadoop-core-hash-2-branch-0.20.patch
>
>
> HADOOP-5079 made the following change: "HashFunction.hash restricts initval
> for the next hash to the [0, maxValue) range of the hash indexes returned.
> This is suboptimal, particularly for larger nbHash and smaller maxValue.
> Rather we should first set initval, then restrict the range for the result
> assignment." The patch committed on that issue introduced a new bug: "My
> first patch contained a regression: you have to take the remainder before
> calling Math.abs, since Math.abs(Integer.MIN_VALUE) == Integer.MIN_VALUE
> still" (Jonathan Ellis).
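A minimal standalone Java sketch of why the ordering matters (the variable
names and the maxValue of 1000 below are illustrative only, not taken from
the committed patch): Math.abs(Integer.MIN_VALUE) overflows and stays
negative, so applying Math.abs before the remainder can still produce a
negative index, while taking the remainder first keeps the value safely
inside (-maxValue, maxValue) before Math.abs is applied.

    public class AbsRemainderDemo {
      public static void main(String[] args) {
        int hash = Integer.MIN_VALUE; // worst-case output of the underlying hash
        int maxValue = 1000;          // size of the index range, for illustration

        // Buggy order (abs, then remainder): Math.abs(MIN_VALUE) is still
        // MIN_VALUE, so the result here is -648, an invalid negative index.
        int buggy = Math.abs(hash) % maxValue;

        // Fixed order (remainder, then abs): the remainder is already within
        // (-maxValue, maxValue), so Math.abs cannot overflow; result is 648.
        int fixed = Math.abs(hash % maxValue);

        System.out.println("buggy = " + buggy); // negative
        System.out.println("fixed = " + fixed); // in [0, maxValue)
      }
    }

In this sketch only the ordering of the two operations changes; carrying the
full, unrestricted hash value forward as the next initval (rather than the
already-reduced index) is the separate improvement described by HADOOP-5079.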
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.