Re: HashMap bug for large sizes

2012-06-01 Thread David Holmes
On 1/06/2012 11:36 PM, Doug Lea wrote: On 06/01/12 05:29, Kasper Nielsen wrote: I don't know if this has been discussed before. But I was looking at the HashMap implementation today and noticed that there are some issues with very large HashMaps with more than Integer.MAX_VALUE elements.

Re: HashMap bug for large sizes

2012-06-01 Thread Kasper Nielsen
On 01-06-2012 22:05, Eamonn McManus wrote: But it is not just serializing a HashMap that does not work. HashMap.size() and HashMap.clear() aren't working either. I don't see what's wrong with HashMap.clear(), My mistake, I was looking at the HashMap implementation from Harmony. but HashMap.size(

Re: HashMap bug for large sizes

2012-06-01 Thread Eamonn McManus
> But it is not just serializing a HashMap that does not work. HashMap.size() and HashMap.clear() aren't working either. I don't see what's wrong with HashMap.clear(), but HashMap.size() is clearly buggy and should be fixed. There's also a performance problem in that accesses start becoming line

Re: HashMap bug for large sizes

2012-06-01 Thread Kasper Nielsen
On 01-06-2012 21:12, Eamonn McManus wrote: It seems to me that since the serialization of HashMaps with more than Integer.MAX_VALUE entries produces an output that cannot be deserialized, nobody can be using it, and we are free to change it. For example we could say that if the read size is -1 th

Re: HashMap bug for large sizes

2012-06-01 Thread Eamonn McManus
It seems to me that since the serialization of HashMaps with more than Integer.MAX_VALUE entries produces an output that cannot be deserialized, nobody can be using it, and we are free to change it. For example we could say that if the read size is -1 then the next item in the stream is a long that
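A rough sketch of what that sentinel scheme could look like, assuming a hypothetical long count field; this illustrates the proposal only, not the actual HashMap serialization code:

    private void writeObject(java.io.ObjectOutputStream s) throws java.io.IOException {
        s.defaultWriteObject();
        if (count <= Integer.MAX_VALUE) {
            s.writeInt((int) count);   // unchanged format for ordinary maps
        } else {
            s.writeInt(-1);            // sentinel: the true count follows as a long
            s.writeLong(count);
        }
        // ... followed by the key/value pairs, as today
    }

    private void readObject(java.io.ObjectInputStream s)
            throws java.io.IOException, ClassNotFoundException {
        s.defaultReadObject();
        int rawSize = s.readInt();
        long count = (rawSize == -1) ? s.readLong() : rawSize;
        // ... then read 'count' key/value pairs
    }

Because existing streams always carry a non-negative int in that slot, maps serialized today would keep deserializing unchanged under such a scheme.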

Re: HashMap bug for large sizes

2012-06-01 Thread Doug Lea
On 06/01/12 05:29, Kasper Nielsen wrote: Hi, I don't know if this has been discussed before. But I was looking at the HashMap implementation today and noticed that there are some issues with very large HashMaps with more than Integer.MAX_VALUE elements. I think this arose on this list (o

HashMap bug for large sizes

2012-06-01 Thread Kasper Nielsen
Hi, I don't know if this has been discussed before. But I was looking at the HashMap implementation today and noticed that there are some issues with very large HashMaps with more than Integer.MAX_VALUE elements. 1. The Map contract says that "If the map contains more than Integer.MAX_VALUE
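For reference, a minimal sketch (not the JDK code) of a size() that follows that part of the Map contract, assuming the element count were tracked in a hypothetical long field:

    class SizeClampSketch {
        private long count;                  // tracked as long so it cannot overflow int

        void entryAdded() { count++; }       // hypothetical bookkeeping hook

        int size() {
            // Map contract: once the real count exceeds Integer.MAX_VALUE,
            // report Integer.MAX_VALUE instead of letting the value wrap negative.
            return (count > Integer.MAX_VALUE) ? Integer.MAX_VALUE : (int) count;
        }
    }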

Re: review request: 4244896: (process) Provide System.getPid(), System.killProcess(String pid)

2012-06-01 Thread Alan Bateman
On 31/05/2012 17:48, Rob McKenna wrote: That link should be: http://cr.openjdk.java.net/~robm/4244896/webrev.04/ -Rob I'm happy the spec has come good too. One small suggestion is to tweak this line: "{@code Process} objects returned by {@link ProcessBuilder#start} and {@link Runtime#ex

hg: jdk8/tl/jdk: 7173432: Handle null key at HashMap resize

2012-06-01 Thread mike.duigou
Changeset: 7baa22e6a6b3 Author: mduigou Date: 2012-06-01 00:05 -0700 URL: http://hg.openjdk.java.net/jdk8/tl/jdk/rev/7baa22e6a6b3 7173432: Handle null key at HashMap resize Summary: If the key to be inserted into a HashMap is null and the table needs to be resized as part of the ins
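For context, a small illustration (not the changeset's code) of the classic HashMap convention the fix has to respect: a null key is never passed through the hash function and always lives in bucket 0, so the insertion path must special-case null even when the insert triggers a resize:

    static int indexFor(Object key, int tableLength) {
        if (key == null) {
            return 0;                      // null-key convention: always bucket 0
        }
        int h = key.hashCode();
        h ^= (h >>> 16);                   // illustrative bit spreading only
        return h & (tableLength - 1);      // tableLength is a power of two
    }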