szetszwo commented on code in PR #9550:
URL: https://github.com/apache/ozone/pull/9550#discussion_r2649967812


##########
hadoop-hdds/framework/src/test/java/org/apache/hadoop/hdds/utils/db/TestCodec.java:
##########
@@ -295,14 +298,13 @@ public static <T> void runTest(Codec<T> codec, T original,
   static <T> void runTestBytes(T object, Codec<T> codec) throws IOException {
     final byte[] array = codec.toPersistedFormat(object);
     final Bytes fromArray = new Bytes(array);
-
-    try (CodecBuffer buffer = codec.toCodecBuffer(object,
-        CodecBuffer.Allocator.HEAP)) {
-      final Bytes fromBuffer = new Bytes(buffer);
-
-      assertEquals(fromArray.hashCode(), fromBuffer.hashCode());
-      assertEquals(fromArray, fromBuffer);
-      assertEquals(fromBuffer, fromArray);
+    for (CodecBuffer.Allocator allocator : ImmutableList.of(CodecBuffer.Allocator.HEAP, CodecBuffer.Allocator.DIRECT)) {

Review Comment:
   Pass the allocator as a parameter instead of adding a loop.
   ```java
       runTestBytes(original, codec, CodecBuffer.Allocator.HEAP);
       runTestBytes(original, codec, CodecBuffer.Allocator.DIRECT);
     }
   
  static <T> void runTestBytes(T object, Codec<T> codec,
      CodecBuffer.Allocator allocator) throws IOException {
       final byte[] array = codec.toPersistedFormat(object);
       final Bytes fromArray = new Bytes(array);
   
       try (CodecBuffer buffer = codec.toCodecBuffer(object, allocator)) {
   ```
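The refactor suggested above, parameterizing the test helper rather than looping inside it, can be sketched in plain Java. This is a minimal stand-alone illustration: the `Allocator` enum and `encode` method below are hypothetical stand-ins for `CodecBuffer.Allocator` and `codec.toCodecBuffer(...)`, not the Ozone types.

```java
import java.util.Arrays;

public class AllocatorParamSketch {
  // Hypothetical stand-in for CodecBuffer.Allocator.
  enum Allocator { HEAP, DIRECT }

  // Hypothetical encode step; the real code would call
  // codec.toCodecBuffer(object, allocator) and wrap the result in Bytes.
  static byte[] encode(String object, Allocator allocator) {
    return object.getBytes();  // allocator choice does not change contents
  }

  // Parameterized helper: the caller decides which allocator to exercise,
  // so each invocation covers exactly one configuration.
  static boolean runTestBytes(String object, Allocator allocator) {
    final byte[] fromArray = object.getBytes();
    final byte[] fromBuffer = encode(object, allocator);
    return Arrays.equals(fromArray, fromBuffer);
  }

  public static void main(String[] args) {
    // Call once per allocator instead of looping inside the helper.
    System.out.println(runTestBytes("abc", Allocator.HEAP));
    System.out.println(runTestBytes("abc", Allocator.DIRECT));
  }
}
```

One benefit of this shape over the loop: when an assertion fails, the failing allocator is visible directly in the call site and stack trace rather than hidden in a loop variable.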



##########
hadoop-hdds/framework/src/main/java/org/apache/hadoop/hdds/utils/db/RDBBatchOperation.java:
##########
@@ -80,26 +81,33 @@ private static String countSize2String(int count, long size) {
    * To implement {@link #equals(Object)} and {@link #hashCode()}
    * based on the contents of the bytes.
    */
-  static final class Bytes {
-    private final byte[] array;
-    private final CodecBuffer buffer;
+  static final class Bytes implements Closeable {
+    private AbstractSlice<?> slice;
     /** Cache the hash value. */
-    private final int hash;
+    private int hash;
 
     Bytes(CodecBuffer buffer) {
-      this.array = null;
-      this.buffer = Objects.requireNonNull(buffer, "buffer == null");
-      this.hash = buffer.asReadOnlyByteBuffer().hashCode();
+      Objects.requireNonNull(buffer, "buffer == null");
+      if (buffer.isDirect()) {
+        initWithDirectByteBuffer(buffer.asReadOnlyByteBuffer());
+      } else {
+        initWithByteArray(buffer.getArray());
+      }
     }
 
     Bytes(byte[] array) {
-      this.array = array;
-      this.buffer = null;
+      Objects.requireNonNull(array, "array == null");
+      initWithByteArray(array);
+    }
+
+    private void initWithByteArray(byte[] array) {
+      this.slice = new ManagedSlice(array);
       this.hash = ByteBuffer.wrap(array).hashCode();
     }
 
-    ByteBuffer asReadOnlyByteBuffer() {
-      return buffer.asReadOnlyByteBuffer();
+    private void initWithDirectByteBuffer(ByteBuffer byteBuffer) {
+      this.slice = new ManagedDirectSlice(byteBuffer);
+      this.hash = byteBuffer.hashCode();
     }

Review Comment:
   Constructors should not call other methods to initialize fields.  Otherwise, the fields cannot be `final`, and keeping fields `final` is very useful for reducing bugs and for understanding the code.
   ```java
       private final AbstractSlice<?> slice;
       /** Cache the hash value. */
       private final int hash;
   
       static Bytes newBytes(CodecBuffer buffer) {
         Objects.requireNonNull(buffer, "buffer == null");
      return buffer.isDirect()
          ? new Bytes(buffer.asReadOnlyByteBuffer())
          : new Bytes(buffer.getArray());
       }
   
       Bytes(ByteBuffer buffer) {
         Objects.requireNonNull(buffer, "buffer == null");
         assertTrue(buffer.isDirect(), "buffer is not direct");
         this.slice = new ManagedDirectSlice(buffer);
         this.hash = buffer.hashCode();
       }
   
       Bytes(byte[] array) {
         Objects.requireNonNull(array, "array == null");
         this.slice = new ManagedSlice(array);
         this.hash = ByteBuffer.wrap(array).hashCode();
       }
   ```
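The pattern in this suggestion, a static factory that does the branching so each constructor assigns every field exactly once, can be sketched independently of RocksDB. The class and field names below are simplified stand-ins, not the actual `Bytes`/`ManagedSlice` types; the point is only that both fields stay `final` on every construction path.

```java
import java.nio.ByteBuffer;
import java.util.Objects;

public final class FinalFieldsSketch {
  // Both fields are final: each constructor assigns them exactly once
  // and calls no overridable methods during construction.
  private final byte[] data;
  private final int hash;

  // The static factory does the branching a constructor should not:
  // it normalizes the input, then delegates to a simple constructor.
  static FinalFieldsSketch of(ByteBuffer buffer) {
    Objects.requireNonNull(buffer, "buffer == null");
    final byte[] copy = new byte[buffer.remaining()];
    buffer.duplicate().get(copy);
    return new FinalFieldsSketch(copy);
  }

  FinalFieldsSketch(byte[] array) {
    this.data = Objects.requireNonNull(array, "array == null");
    this.hash = ByteBuffer.wrap(array).hashCode();  // cache the hash
  }

  @Override
  public int hashCode() {
    return hash;
  }

  public static void main(String[] args) {
    final FinalFieldsSketch a = new FinalFieldsSketch(new byte[]{1, 2, 3});
    final FinalFieldsSketch b = of(ByteBuffer.wrap(new byte[]{1, 2, 3}));
    // Same contents -> same cached hash, regardless of construction path.
    System.out.println(a.hashCode() == b.hashCode());
  }
}
```

With `final` fields the compiler verifies definite assignment in every constructor, and readers can rule out post-construction mutation when tracing state.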



##########
hadoop-hdds/framework/src/main/java/org/apache/hadoop/hdds/utils/db/RDBBatchOperation.java:
##########
@@ -127,12 +131,21 @@ public int hashCode() {
 
     @Override
     public String toString() {
-      return array != null ? bytes2String(array)
-          : bytes2String(asReadOnlyByteBuffer());
+      return slice.toString();

Review Comment:
   Have you checked what it will return?  Could you show an example output?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

