Pedantic note about hashCode and equals: the contract isn't bidirectional.
You only need to ensure that a.hashCode == b.hashCode whenever a.equals(b);
the converse is usually impossible to satisfy because of hash collisions.
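A quick Scala illustration of that one-way contract; the string pair "Aa"/"BB" is a well-known JVM hash collision:

```scala
object HashContract extends App {
  // Equal values must have equal hash codes...
  val x = "spark"
  val y = "spark"
  assert(x == y && x.hashCode == y.hashCode)

  // ...but equal hash codes do not imply equal values:
  // "Aa" and "BB" collide under String.hashCode (both hash to 2112).
  assert("Aa".hashCode == "BB".hashCode)
  assert("Aa" != "BB")
}
```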

Good info: 
http://www.programcreek.com/2011/07/java-equals-and-hashcode-contract/
                _____________________________
From: Jakob Odersky <ja...@odersky.com>
Sent: Wednesday, September 21, 2016 15:12
Subject: Re: What's the use of RangePartitioner.hashCode
To: WangJianfei <wangjianfe...@otcaix.iscas.ac.cn>
Cc: dev <dev@spark.apache.org>


Hi,
It is used jointly with a custom implementation of the `equals`
method. In Scala, you can override the `equals` method to change the
behaviour of `==` comparison. One example of this would be to compare
classes based on their parameter values (i.e. what case classes do).
Partitioners aren't case classes; however, it makes sense to have a
value comparison between them (see RDD.subtract for an example), and
hence they redefine the equals method.
When redefining an equals method, it is good practice to also redefine
the hashCode method so that `a == b` iff `a.hashCode == b.hashCode`
(e.g. this is useful when your objects will be stored in a hash map).
You can learn more about redefining the equals method and hash codes
here:
https://www.safaribooksonline.com/library/view/scala-cookbook/9781449340292/ch04s16.html
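The pattern described above can be sketched like this (a hypothetical `Bucket` class for illustration, not Spark's actual Partitioner code):

```scala
// Value-based equals plus a matching hashCode, so that different
// instances with the same contents behave as the same hash-map key.
class Bucket(val id: Int) {
  override def equals(other: Any): Boolean = other match {
    case that: Bucket => that.id == id
    case _            => false
  }
  override def hashCode: Int = id // must be consistent with equals
}

object BucketDemo extends App {
  val m = Map(new Bucket(1) -> "a")
  // A *different* instance with the same id finds the entry; this
  // would fail with the default reference-based equals/hashCode.
  assert(m(new Bucket(1)) == "a")
}
```

Without the hashCode override, the map lookup would probe the wrong bucket and miss the entry even though `equals` returns true.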


regards,
--Jakob

On Thu, Sep 15, 2016 at 6:17 PM, WangJianfei
<wangjianfe...@otcaix.iscas.ac.cn> wrote:
> who can give me an example of the use of RangePartitioner.hashCode, thank
> you!
>
>
>
> --
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
