On 25/03/15 00:05, Martin Buchholz wrote:
How about this, which resurrects the jdk7 doc strings for the legacy fields?
That is fine with me, and similar to what I had at one point. I just
wasn't sure if you wanted to stuff the descriptions back in. If they are
still valid, then it is the best solution.
[ I think these were inadvertently dropped with the CHMv8 work. ]
-Chris.
--- src/main/java/util/concurrent/ConcurrentHashMap.java  24 Mar 2015 22:30:53 -0000  1.270
+++ src/main/java/util/concurrent/ConcurrentHashMap.java  25 Mar 2015 00:03:43 -0000
@@ -566,7 +566,16 @@
/** Number of CPUS, to place bounds on some sizings */
static final int NCPU = Runtime.getRuntime().availableProcessors();
- /** For serialization compatibility. */
+ /**
+ * Serialized pseudo-fields, provided only for jdk7 compatibility.
+ * @serialField segments Segment[]
+ * The segments, each of which is a specialized hash table.
+ * @serialField segmentMask int
+ * Mask value for indexing into segments. The upper bits of a
+ * key's hash code are used to choose the segment.
+ * @serialField segmentShift int
+ * Shift value for indexing within segments.
+ */
private static final ObjectStreamField[] serialPersistentFields = {
new ObjectStreamField("segments", Segment[].class),
new ObjectStreamField("segmentMask", Integer.TYPE),
On Tue, Mar 24, 2015 at 2:38 AM, Chris Hegarty <[email protected]> wrote:
Martin,
On 23 Mar 2015, at 22:15, Martin Buchholz <[email protected]> wrote:
> Let us know if the serialization code of the collection classes can be
> improved.
I think there are a few minor cleanups that would be beneficial in CHM:
1) Add @serialField so that the serializable stream fields show up in
   the javadoc (serial form), since they are still part of the serial
   form, even though not used in the implementation. This is just
   documenting existing behaviour/form.
2) Mark correctly identified a small hole in the putFields() spec,
   which should be fixed. A minor, benign change in CHM writeObject can
   avoid this (spec ambiguity).
Please consider the following change:
diff --git a/src/java.base/share/classes/java/util/concurrent/ConcurrentHashMap.java b/src/java.base/share/classes/java/util/concurrent/ConcurrentHashMap.java
--- a/src/java.base/share/classes/java/util/concurrent/ConcurrentHashMap.java
+++ b/src/java.base/share/classes/java/util/concurrent/ConcurrentHashMap.java
@@ -35,6 +35,7 @@
package java.util.concurrent;
+import java.io.ObjectOutputStream;
import java.io.ObjectStreamField;
import java.io.Serializable;
import java.lang.reflect.ParameterizedType;
@@ -599,7 +600,12 @@
/** Number of CPUS, to place bounds on some sizings */
static final int NCPU = Runtime.getRuntime().availableProcessors();
- /** For serialization compatibility. */
+ /**
+ * For serialization compatibility.
+ * @serialField segments Segment[]
+ * @serialField segmentMask int
+ * @serialField segmentShift int
+ */
private static final ObjectStreamField[] serialPersistentFields = {
new ObjectStreamField("segments", Segment[].class),
new ObjectStreamField("segmentMask", Integer.TYPE),
@@ -1400,9 +1406,10 @@
new Segment<?,?>[DEFAULT_CONCURRENCY_LEVEL];
for (int i = 0; i < segments.length; ++i)
segments[i] = new Segment<K,V>(LOAD_FACTOR);
- s.putFields().put("segments", segments);
- s.putFields().put("segmentShift", segmentShift);
- s.putFields().put("segmentMask", segmentMask);
+ ObjectOutputStream.PutField streamFields = s.putFields();
+ streamFields.put("segments", segments);
+ streamFields.put("segmentShift", segmentShift);
+ streamFields.put("segmentMask", segmentMask);
s.writeFields();
Node<K,V>[] t;
-Chris.
> On Mon, Mar 23, 2015 at 2:25 PM, Mark Sheppard <[email protected]>
> wrote:
>
>> Hi
>> please oblige and review the following fix
>>
>> http://cr.openjdk.java.net/~msheppar/8068721/jdk9/corba/webrev/
>> http://cr.openjdk.java.net/~msheppar/8068721/jdk9/jdk/webrev/
>>
>>
>> which addresses the issue in
>> https://bugs.openjdk.java.net/browse/JDK-8068721
>>
>> This relates to RMI-IIOP and the interplay between custom marshalling
>> of ValueTypes and the corresponding serialization of a Java class, in
>> this case ConcurrentHashMap.
>>
>> ConcurrentHashMap changed its structure in jdk8 from that used in
>> jdk7. This resulted in modifications to the readObject and writeObject
>> methods; in particular, the former serial fields were removed, with
>> writeObject now using PutField and readObject using defaultReadObject.
>> The writeObject invokes the putFields method of an ObjectOutputStream
>> multiple times, and assumes that it will receive the same PutField
>> object instance for each invocation. The spec doesn't endorse this
>> behaviour - but that's what the implementation of ObjectOutputStream
>> provides. However, in the case of RMI-IIOP, the OutputStreamHook, a
>> subclass of ObjectOutputStream, returns a new instance for each
>> putFields invocation. Thus, the ConcurrentHashMap writeObject results
>> in improper serialization in the context of RMI-IIOP.
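A minimal standalone sketch of the assumption described above (not code from the webrev; the Probe class is invented): with the stock ObjectOutputStream, repeated calls to putFields() happen to hand back the same PutField instance, which is exactly the behaviour the spec does not currently guarantee.

    import java.io.*;

    public class PutFieldsIdentityDemo {
        // A throwaway serializable class that, like the jdk8 CHM writeObject,
        // calls putFields() more than once and relies on getting the same object back.
        static class Probe implements Serializable {
            private static final long serialVersionUID = 1L;
            private static final ObjectStreamField[] serialPersistentFields = {
                new ObjectStreamField("x", Integer.TYPE),
                new ObjectStreamField("y", Integer.TYPE),
            };
            private void writeObject(ObjectOutputStream s) throws IOException {
                ObjectOutputStream.PutField first  = s.putFields();
                ObjectOutputStream.PutField second = s.putFields();
                // true for java.io.ObjectOutputStream; the CORBA OutputStreamHook
                // returned a fresh instance on each call (JDK-8068721)
                System.out.println("same PutField instance: " + (first == second));
                first.put("x", 1);
                second.put("y", 2);
                s.writeFields();
            }
        }

        public static void main(String[] args) throws IOException {
            try (ObjectOutputStream out =
                     new ObjectOutputStream(new ByteArrayOutputStream())) {
                out.writeObject(new Probe());
            }
        }
    }

Calling putFields() once and reusing the result, as the CHM change above does, keeps the class within what the spec actually promises, whichever way the ambiguity is eventually resolved.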
>> In the unmarshalling flow of ConcurrentHashMap, the readObject now
>> uses defaultReadObject rather than GetField. In this call flow the
>> IIOPInputStream attempts to ignore any primitive field in the
>> serialized data that doesn't correspond with a reflective field of
>> the object; however, it leaves the primitive values in the stream.
>> Thus, in the case of ConcurrentHashMap, which has serialized two
>> integers and an array of Segments (a subclass of ReentrantLock), this
>> results in erroneous deserialization: a value tag in the stream is
>> misinterpreted as an array length, an attempt to allocate a very
>> large array ensues, and an exception is thrown.
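To make the read side concrete, here is a small round trip (again an invented illustration, not code from the webrev) showing why the stock streams cope with legacy pseudo-fields: defaultReadObject reads and discards the values of stream fields that have no matching real field, leaving the stream positioned at the custom data written after writeFields().

    import java.io.*;

    public class LegacyFieldsRoundTrip {
        static class Compat implements Serializable {
            private static final long serialVersionUID = 1L;
            // Pseudo-fields kept only for stream compatibility; no matching real fields.
            private static final ObjectStreamField[] serialPersistentFields = {
                new ObjectStreamField("segmentShift", Integer.TYPE),
                new ObjectStreamField("segmentMask", Integer.TYPE),
            };
            transient int payload = 42;          // real state, carried as custom data

            private void writeObject(ObjectOutputStream s) throws IOException {
                ObjectOutputStream.PutField f = s.putFields();
                f.put("segmentShift", 16);
                f.put("segmentMask", 15);
                s.writeFields();                 // legacy primitives go into the stream
                s.writeInt(payload);             // followed by the real data
            }

            private void readObject(ObjectInputStream s)
                    throws IOException, ClassNotFoundException {
                s.defaultReadObject();           // legacy primitives read and discarded
                payload = s.readInt();
            }
        }

        public static void main(String[] args) throws Exception {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(new Compat());
            }
            try (ObjectInputStream in = new ObjectInputStream(
                     new ByteArrayInputStream(bytes.toByteArray()))) {
                Compat c = (Compat) in.readObject();
                System.out.println("payload = " + c.payload);   // prints payload = 42
            }
        }
    }

With java.io.ObjectInputStream this prints payload = 42; the failure described above arises because the IIOP input stream dropped the field information but left the primitive bytes in place, so the reads that follow are misaligned.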
>>
>> The OutputStreamHook now returns the same PutField instance,
>> allocated once, for each separate call of the putFields method. This
>> highlights a need to tighten up and disambiguate the
>> ObjectOutputStream spec for putFields.
>>
>> IIOPInputStream now removes the primitive values from the stream.
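In plain Java-serialization terms, removing the primitive values from the stream means consuming the bytes of each ignored primitive field according to its type code. A schematic helper (not the actual IIOPInputStream change, and ignoring any CDR-specific encoding details) might look like:

    import java.io.ByteArrayInputStream;
    import java.io.DataInputStream;
    import java.io.IOException;

    // Schematic only: reads and discards the value of one ignored primitive field,
    // sized by its serialization type code, so later reads stay aligned.
    public class PrimitiveFieldSkipper {
        static void skipPrimitive(DataInputStream in, char typeCode) throws IOException {
            switch (typeCode) {
                case 'Z': in.readBoolean(); break;
                case 'B': in.readByte();    break;
                case 'C': in.readChar();    break;
                case 'S': in.readShort();   break;
                case 'I': in.readInt();     break;
                case 'F': in.readFloat();   break;
                case 'J': in.readLong();    break;
                case 'D': in.readDouble();  break;
                default:  throw new IllegalArgumentException(
                              "not a primitive type code: " + typeCode);
            }
        }

        public static void main(String[] args) throws IOException {
            // e.g. the two legacy ints (segmentShift, segmentMask) as raw bytes
            DataInputStream in =
                new DataInputStream(new ByteArrayInputStream(new byte[8]));
            skipPrimitive(in, 'I');
            skipPrimitive(in, 'I');
            System.out.println("remaining bytes: " + in.available());   // 0
        }
    }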
>>
>> regards
>> Mark
>>