adoroszlai commented on code in PR #9952:
URL: https://github.com/apache/ozone/pull/9952#discussion_r2971198679


##########
hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/ha/io/ScmCodecFactory.java:
##########
@@ -74,35 +75,36 @@ public final class ScmCodecFactory {
     putEnum(NodeType.class, NodeType::forNumber);
 
     // Must be the last one
-    final ClassResolver resolver = new ClassResolver(codecs.keySet());
+    resolver = new ClassResolver(codecs.keySet());
     codecs.put(List.class, new ScmListCodec(resolver));
   }
 
-  static <T extends Message> void putProto(T proto) {
+  private <T extends Message> void putProto(T proto) {
     final Class<? extends Message> clazz = proto.getClass();
     codecs.put(clazz, new ScmNonShadedGeneratedMessageCodec<>(clazz.getSimpleName(), proto.getParserForType()));
   }
 
-  static <T extends Enum<T> & ProtocolMessageEnum> void putEnum(
+  private <T extends Enum<T> & ProtocolMessageEnum> void putEnum(
       Class<T> enumClass, IntFunction<T> forNumber) {
     codecs.put(enumClass, new ScmEnumCodec<>(enumClass, forNumber));
   }
 
-  private ScmCodecFactory() { }
+  public static ScmCodecFactory getInstance() {
+    return INSTANCE;
+  }
 
-  public static ScmCodec getCodec(Class<?> type)
+  public Class<?> resolve(String className)
       throws InvalidProtocolBufferException {
-    final List<Class<?>> classes = new ArrayList<>();
-    classes.add(type);
-    classes.addAll(ClassUtils.getAllSuperclasses(type));
-    classes.addAll(ClassUtils.getAllInterfaces(type));

Review Comment:
   I think removing supertypes causes the following error:
   
   ```
   org.apache.hadoop.hdds.scm.exceptions.SCMException: org.apache.ratis.thirdparty.com.google.protobuf.InvalidProtocolBufferException: Codec not found for class java.util.ArrayList
        at org.apache.hadoop.hdds.scm.ha.SCMHAInvocationHandler.translateException(SCMHAInvocationHandler.java:163)
        at org.apache.hadoop.hdds.scm.ha.SCMHAInvocationHandler.invokeRatis(SCMHAInvocationHandler.java:113)
        at org.apache.hadoop.hdds.scm.ha.SCMHAInvocationHandler.invoke(SCMHAInvocationHandler.java:72)
        at jdk.proxy2/jdk.proxy2.$Proxy41.addTransactionsToDB(Unknown Source)
        at org.apache.hadoop.hdds.scm.block.SCMDeletedBlockTransactionStatusManager.addTransactions(SCMDeletedBlockTransactionStatusManager.java:470)
        at org.apache.hadoop.hdds.scm.block.DeletedBlockLogImpl.addTransactions(DeletedBlockLogImpl.java:253)
        at org.apache.hadoop.ozone.shell.TestDeletedBlocksTxnShell.testGetDeletedBlockSummarySubcommand(TestDeletedBlocksTxnShell.java:178)
        at java.base/java.lang.reflect.Method.invoke(Method.java:580)
        at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387)
        at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312)
        at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843)
        at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808)
        at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188)
   Caused by: org.apache.ratis.thirdparty.com.google.protobuf.InvalidProtocolBufferException: Codec not found for class java.util.ArrayList
        at org.apache.hadoop.hdds.scm.ha.io.ScmCodecFactory.getCodec(ScmCodecFactory.java:107)
        at org.apache.hadoop.hdds.scm.ha.SCMRatisRequest.encode(SCMRatisRequest.java:107)
        at org.apache.hadoop.hdds.scm.ha.SCMRatisServerImpl.submitRequest(SCMRatisServerImpl.java:238)
        at org.apache.hadoop.hdds.scm.ha.SCMHAInvocationHandler.invokeRatisServer(SCMHAInvocationHandler.java:121)
        at org.apache.hadoop.hdds.scm.ha.SCMHAInvocationHandler.invokeRatis(SCMHAInvocationHandler.java:110)
   ```
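   For context, the removed lines walked a type's superclasses and interfaces so that a codec registered under `List.class` was still found for an `ArrayList` argument. A minimal sketch of that lookup, assuming a registry keyed by class (simplified stand-in, not the real Ozone code; the original used `ClassUtils` from commons-lang3 for the traversal):

   ```java
   import java.util.ArrayList;
   import java.util.LinkedHashSet;
   import java.util.List;
   import java.util.Map;
   import java.util.Set;

   // Sketch of the lookup the removed lines performed: collect the class plus
   // all its supertypes, then return the first one with a registered codec.
   public class SupertypeLookup {
     // Registry with a codec keyed under List.class only, as in the factory.
     static final Map<Class<?>, String> CODECS = Map.of(List.class, "ScmListCodec");

     static Set<Class<?>> typesOf(Class<?> c) {
       Set<Class<?>> all = new LinkedHashSet<>();
       for (Class<?> k = c; k != null; k = k.getSuperclass()) {
         all.add(k);
         for (Class<?> i : k.getInterfaces()) {
           collectInterfaces(i, all);
         }
       }
       return all;
     }

     static void collectInterfaces(Class<?> i, Set<Class<?>> all) {
       if (all.add(i)) {
         for (Class<?> parent : i.getInterfaces()) {
           collectInterfaces(parent, all);
         }
       }
     }

     static String getCodec(Class<?> type) {
       for (Class<?> c : typesOf(type)) {
         String codec = CODECS.get(c);
         if (codec != null) {
           return codec;
         }
       }
       throw new IllegalStateException("Codec not found for class " + type.getName());
     }

     public static void main(String[] args) {
       // With the walk, an ArrayList argument still resolves the List codec.
       System.out.println(getCodec(ArrayList.class)); // prints "ScmListCodec"
     }
   }
   ```

   Without the walk, only an exact class match succeeds, which is why the lookup for `java.util.ArrayList` now fails.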
   
   Maybe we can fix it by changing the `ArrayList` parameter to `List` in `DeletedBlockLogStateManager` and its implementation:
   
   
https://github.com/apache/ozone/blob/6cdc9fcc57bb57aa3a49b659a42b928cb4811fa8/hadoop-hdds/server-scm/src/main/java/org/apache/hadoop/hdds/scm/block/DeletedBlockLogStateManager.java#L32-L56
   
   Declaring as `List` is good practice in any case.
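   A hypothetical illustration of the suggested change (interface and method names are made up, not the actual `DeletedBlockLogStateManager` API), assuming the request encoder resolves codecs from the method's declared parameter types, which is what makes declaring `List` sufficient:

   ```java
   import java.lang.reflect.Method;
   import java.util.ArrayList;
   import java.util.List;
   import java.util.Map;

   // Hypothetical sketch: with no supertype walk, only an exact class match
   // succeeds, so the declared parameter type must be the registered List.
   public class DeclaredTypeDemo {
     interface BeforeFix { void addTransactionsToDB(ArrayList<Long> txs); }
     interface AfterFix  { void addTransactionsToDB(List<Long> txs); }

     // Registry with the codec keyed under List.class only.
     static final Map<Class<?>, String> CODECS = Map.of(List.class, "ScmListCodec");

     // Resolve a codec from the declared parameter type of the proxied method.
     static String codecFor(Class<?> iface) {
       Method m = iface.getMethods()[0];
       Class<?> paramType = m.getParameterTypes()[0];
       String codec = CODECS.get(paramType);
       if (codec == null) {
         throw new IllegalStateException("Codec not found for class " + paramType.getName());
       }
       return codec;
     }

     public static void main(String[] args) {
       System.out.println(codecFor(AfterFix.class)); // prints "ScmListCodec"
       try {
         codecFor(BeforeFix.class); // registry has no ArrayList entry
       } catch (IllegalStateException e) {
         System.out.println(e.getMessage()); // "Codec not found for class java.util.ArrayList"
       }
     }
   }
   ```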



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
