cxzl25 commented on PR #53193:
URL: https://github.com/apache/spark/pull/53193#issuecomment-3621949879
> We do a reset, which should have cleared this map - still an issue ?
The clear method only sets the entries of the keyTable array to null; it does not re-allocate (shrink) the keyTable itself, so the next put still triggers this problem.
com.esotericsoftware.kryo.util.IdentityObjectIntMap#clear(int)
```java
public void clear (int maximumCapacity) {
    if (capacity <= maximumCapacity) {
        clear();
        return;
    }
    // ... (when capacity > maximumCapacity, the backing arrays are resized instead; elided)
}

public void clear () {
    K[] keyTable = this.keyTable;
    for (int i = capacity + stashSize; i-- > 0;)
        keyTable[i] = null;
    size = 0;
    stashSize = 0;
}
```
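To make the effect visible directly, here is a minimal sketch (my own check, not part of this PR; it assumes the Kryo version quoted above, where `capacity` is a field declared on `IdentityObjectIntMap` itself, and reads it via reflection). It shows that the early-return path of `clear(int)` nulls the entries but leaves the oversized table in place:
```scala
import com.esotericsoftware.kryo.util.IdentityObjectIntMap

object ClearKeepsCapacity {
  // Assumption: the field is named `capacity`, as in the excerpt above.
  // It is not public, so setAccessible is needed.
  def capacityOf(map: IdentityObjectIntMap[_]): Int = {
    val f = classOf[IdentityObjectIntMap[_]].getDeclaredField("capacity")
    f.setAccessible(true)
    f.getInt(map)
  }

  def main(args: Array[String]): Unit = {
    val map = new IdentityObjectIntMap[String](1 << 20, 0.8f)
    map.put("k1", 1)
    println(s"capacity before clear: ${capacityOf(map)}")

    // capacity <= maximumCapacity, so clear(int) falls through to clear():
    // the entries are nulled, but the large keyTable is kept as-is.
    map.clear(Int.MaxValue)
    println(s"capacity after clear:  ${capacityOf(map)}")
  }
}
```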
> I cant seem to reproduce it with the test you have provided
Because `IdentityObjectIntMap` uses `System.identityHashCode` to compute the hash, different environments can produce different results.
I have modified the reproduction code; this version may be easier to reproduce with.
```scala
import com.esotericsoftware.kryo.util._
val identityObjectIntMap = new IdentityObjectIntMap[String](1073741824, 0.8f)
identityObjectIntMap.put("k1", 1)
identityObjectIntMap.clear((1073741824) << 1)
identityObjectIntMap.clear(2048)
for (i <- 0 until 10000) {
  val s = new String("k_" + i)
  val i1 = System.identityHashCode(s) & 2147483647
  if (i1 > 1073741866) {
    println(s"Found one: $s, hash: $i1")
    identityObjectIntMap.put(s, 1)
  }
}
```
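(On the magic numbers: 2147483647 is `Integer.MAX_VALUE`, so the mask just clears the sign bit of the identity hash, and 1073741824 is 2^30, the initial capacity passed to the map; the threshold 1073741866 keeps only keys whose identity hash falls just above that capacity.)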