chaokunyang opened a new issue, #1658:
URL: https://github.com/apache/incubator-fury/issues/1658

   ### Search before asking
   
- [X] I had searched in the [issues](https://github.com/apache/incubator-fury/issues) and found no similar issues.
   
   
   ### Version
   
   0.5.1
   
   ### Component(s)
   
   Java
   
   ### Minimal reproduce step
   
    ```java
    public class SparkTypeTest extends TestBase {
      @Test(dataProvider = "enableCodegen")
      public void testObjectType(boolean enableCodegen) {
        Fury fury = builder().withRefTracking(true).withCodegen(enableCodegen).build();
        fury.serialize(DecimalType$.MODULE$);
        fury.serialize(new DecimalType(10, 10));
      }
    }
    ```
   
   Spark 2.4.1
   
   ### What did you expect to see?
   
Serialization succeeds without any codegen compile error.
   
   ### What did you see instead?
   
   ```
    java.lang.RuntimeException: Create sequential serializer failed, class: class org.apache.spark.sql.types.DecimalType
         at org.apache.fury.serializer.CodegenSerializer.loadCodegenSerializer(CodegenSerializer.java:52)
         at org.apache.fury.resolver.ClassResolver.lambda$getObjectSerializerClass$2(ClassResolver.java:954)
         at org.apache.fury.builder.JITContext.registerSerializerJITCallback(JITContext.java:131)
         at org.apache.fury.resolver.ClassResolver.getObjectSerializerClass(ClassResolver.java:952)
         at org.apache.fury.resolver.ClassResolver.getSerializerClass(ClassResolver.java:885)
         at org.apache.fury.resolver.ClassResolver.getSerializerClass(ClassResolver.java:782)
         at org.apache.fury.resolver.ClassResolver.createSerializer(ClassResolver.java:1168)
         at org.apache.fury.resolver.ClassResolver.getClassInfo(ClassResolver.java:1062)
         at org.apache.spark.sql.types.StructFieldFuryRefCodec_1_666988784_1856050011.writeClassAndObject$(StructFieldFuryRefCodec_1_666988784_1856050011.java:52)
         at org.apache.spark.sql.types.StructFieldFuryRefCodec_1_666988784_1856050011.writeFields$(StructFieldFuryRefCodec_1_666988784_1856050011.java:74)
         at org.apache.spark.sql.types.StructFieldFuryRefCodec_1_666988784_1856050011.write(StructFieldFuryRefCodec_1_666988784_1856050011.java:119)
         at org.apache.fury.Fury.writeNonRef(Fury.java:441)
         at org.apache.fury.serializer.ArraySerializers$ObjectArraySerializer.write(ArraySerializers.java:104)
         at org.apache.fury.serializer.ArraySerializers$ObjectArraySerializer.write(ArraySerializers.java:42)
         at org.apache.spark.sql.types.StructTypeFuryRefCodec_1_666988784_1735054408.writeClassAndObject$(StructTypeFuryRefCodec_1_666988784_1735054408.java:71)
         at org.apache.spark.sql.types.StructTypeFuryRefCodec_1_666988784_1735054408.writeFields$(StructTypeFuryRefCodec_1_666988784_1735054408.java:137)
         at org.apache.spark.sql.types.StructTypeFuryRefCodec_1_666988784_1735054408.write(StructTypeFuryRefCodec_1_666988784_1735054408.java:364)
         at org.apache.fury.serializer.collection.CollectionSerializers$DefaultJavaCollectionSerializer.write(CollectionSerializers.java:541)
         at org.apache.fury.Fury.writeNonRef(Fury.java:441)
         at org.apache.fury.serializer.ArraySerializers$ObjectArraySerializer.write(ArraySerializers.java:104)
         at org.apache.fury.serializer.ArraySerializers$ObjectArraySerializer.write(ArraySerializers.java:42)
         at org.apache.fury.Fury.writeData(Fury.java:550)
         at org.apache.fury.Fury.write(Fury.java:314)
         at org.apache.fury.Fury.serialize(Fury.java:248)
         at org.apache.fury.Fury.serialize(Fury.java:220)
         at org.apache.fury.ThreadLocalFury.serialize(ThreadLocalFury.java:91)
         at com.alibaba.sparklib.FurySerializerInstance.serialize(FurySerializer.scala:35)
         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:555)
         at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
         at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
         at java.base/java.lang.Thread.run(Thread.java:955)
    Caused by: org.apache.fury.codegen.CodegenException: Compile error: org.apache.spark.sql.types.DecimalTypeFuryRefCodec_1_666988784_533377181:
         at org.apache.fury.codegen.JaninoUtils.toBytecode(JaninoUtils.java:133)
         at org.apache.fury.codegen.JaninoUtils.toBytecode(JaninoUtils.java:73)
         at org.apache.fury.codegen.CodeGenerator.compile(CodeGenerator.java:145)
         at org.apache.fury.builder.CodecUtils.loadOrGenCodecClass(CodecUtils.java:110)
         at org.apache.fury.builder.CodecUtils.loadOrGenObjectCodecClass(CodecUtils.java:43)
         at org.apache.fury.serializer.CodegenSerializer.loadCodegenSerializer(CodegenSerializer.java:49)
         ... 30 more
    Caused by: org.apache.fury.shaded.org.codehaus.commons.compiler.CompileException: File 'org/apache/spark/sql/types/DecimalTypeFuryRefCodec_1_666988784_533377181.java', Line 28, Column 21: IDENTIFIER expected instead of '2'
         at org.apache.fury.shaded.org.codehaus.janino.TokenStreamImpl.read(TokenStreamImpl.java:195)
         at org.apache.fury.shaded.org.codehaus.janino.Parser.read(Parser.java:3804)
         at org.apache.fury.shaded.org.codehaus.janino.Parser.parseClassBodyDeclaration(Parser.java:1040)
         at org.apache.fury.shaded.org.codehaus.janino.Parser.parseClassBody(Parser.java:856)
         at org.apache.fury.shaded.org.codehaus.janino.Parser.parseClassDeclarationRest(Parser.java:746)
         at org.apache.fury.shaded.org.codehaus.janino.Parser.parsePackageMemberTypeDeclarationRest(Parser.java:492)
         at org.apache.fury.shaded.org.codehaus.janino.Parser.parseAbstractCompilationUnit(Parser.java:267)
         at org.apache.fury.shaded.org.codehaus.janino.Compiler.parseAbstractCompilationUnit(Compiler.java:316)
         at org.apache.fury.shaded.org.codehaus.janino.Compiler.compile2(Compiler.java:236)
         at org.apache.fury.shaded.org.codehaus.janino.Compiler.compile(Compiler.java:213)
         at org.apache.fury.codegen.JaninoUtils.toBytecode(JaninoUtils.java:115)
         ... 35 more
   
   ```
   
   ```java
    /* 0001 */ package org.apache.spark.sql.types;
    /* 0002 */ 
    /* 0003 */ import org.apache.fury.Fury;
    /* 0004 */ import org.apache.fury.memory.MemoryBuffer;
    /* 0005 */ import org.apache.fury.resolver.MapRefResolver;
    /* 0006 */ import org.apache.fury.memory.Platform;
    /* 0007 */ import org.apache.fury.resolver.ClassInfo;
    /* 0008 */ import org.apache.fury.resolver.ClassInfoHolder;
    /* 0009 */ import org.apache.fury.resolver.ClassResolver;
    /* 0010 */ import org.apache.fury.builder.Generated;
    /* 0011 */ import org.apache.fury.serializer.CodegenSerializer.LazyInitBeanSerializer;
    /* 0012 */ import org.apache.fury.serializer.Serializers.EnumSerializer;
    /* 0013 */ import org.apache.fury.serializer.Serializer;
    /* 0014 */ import org.apache.fury.serializer.StringSerializer;
    /* 0015 */ import org.apache.fury.serializer.ObjectSerializer;
    /* 0016 */ import org.apache.fury.serializer.CompatibleSerializer;
    /* 0017 */ import org.apache.fury.serializer.collection.AbstractCollectionSerializer;
    /* 0018 */ import org.apache.fury.serializer.collection.AbstractMapSerializer;
    /* 0019 */ import org.apache.fury.builder.Generated.GeneratedObjectSerializer;
    /* 0020 */ 
    /* 0021 */ public final class DecimalTypeFuryRefCodec_1_666988784_533377181 extends GeneratedObjectSerializer {
    /* 0022 */ 
    /* 0023 */   private final MapRefResolver refResolver;
    /* 0024 */   private final ClassResolver classResolver;
    /* 0025 */   private final StringSerializer strSerializer;
    /* 0026 */   private Fury fury;
    /* 0027 */   private ClassInfo ClassInfo;
    /* 0028 */   private ClassInfo 2ClassInfo;
    /* 0029 */   private ClassInfo 4ClassInfo;
    /* 0030 */   private ClassInfo 6ClassInfo;
    /* 0031 */   private final ClassInfoHolder 8ClassInfoHolder;
    /* 0032 */   private final ClassInfoHolder 9ClassInfoHolder;
    /* 0033 */   private final ClassInfoHolder 10ClassInfoHolder;
    /* 0034 */   private final ClassInfoHolder 11ClassInfoHolder;
    /* 0035 */ 
    /* 0036 */   public DecimalTypeFuryRefCodec_1_666988784_533377181(Fury fury, Class classType) {
    /* 0037 */       super(fury, classType);
    /* 0038 */       this.fury = fury;
    /* 0039 */       fury.getClassResolver().setSerializerIfAbsent(classType, this);
    /* 0040 */ 
    /* 0041 */       org.apache.fury.resolver.RefResolver refResolver0 = fury.getRefResolver();
    /* 0042 */       refResolver = ((MapRefResolver)refResolver0);
    /* 0043 */       classResolver = fury.getClassResolver();
    /* 0044 */       strSerializer = fury.getStringSerializer();
    /* 0045 */       ClassInfo = classResolver.nilClassInfo();
    /* 0046 */       2ClassInfo = classResolver.nilClassInfo();
    /* 0047 */       4ClassInfo = classResolver.nilClassInfo();
    /* 0048 */       6ClassInfo = classResolver.nilClassInfo();
    /* 0049 */       8ClassInfoHolder = classResolver.nilClassInfoHolder();
    /* 0050 */       9ClassInfoHolder = classResolver.nilClassInfoHolder();
    /* 0051 */       10ClassInfoHolder = classResolver.nilClassInfoHolder();
    /* 0052 */       11ClassInfoHolder = classResolver.nilClassInfoHolder();
    /* 0053 */   }
    /* 0054 */ 
    /* 0055 */   private void writeClassAndObject(MemoryBuffer memoryBuffer, org.apache.spark.sql.types.Decimal. 1) {
    /* 0056 */       ClassResolver classResolver = this.classResolver;
    /* 0057 */       Class value = ClassInfo.getCls();
    /* 0058 */       Class cls = 1.getClass();
    /* 0059 */       if ((value != cls)) {
    /* 0060 */           ClassInfo = classResolver.getClassInfo(cls);
    /* 0061 */       }
    /* 0062 */       classResolver.writeClass(memoryBuffer, ClassInfo);
    /* 0063 */       ClassInfo.getSerializer().write(memoryBuffer, 1);
    /* 0064 */   }
    /* 0065 */ 
    /* 0066 */   private void writeClassAndObject1(MemoryBuffer memoryBuffer1, org.apache.spark.sql.types.Decimal. 3) {
    /* 0067 */       ClassResolver classResolver = this.classResolver;
    /* 0068 */       Class value0 = 2ClassInfo.getCls();
    /* 0069 */       Class cls0 = 3.getClass();
    /* 0070 */       if ((value0 != cls0)) {
    /* 0071 */           2ClassInfo = classResolver.getClassInfo(cls0);
    /* 0072 */       }
    /* 0073 */       classResolver.writeClass(memoryBuffer1, 2ClassInfo);
    /* 0074 */       2ClassInfo.getSerializer().write(memoryBuffer1, 3);
    /* 0075 */   }
    /* 0076 */ 
    /* 0077 */   private void writeClassAndObject2(MemoryBuffer memoryBuffer2, org.apache.spark.sql.types.Decimal. 5) {
    /* 0078 */       ClassResolver classResolver = this.classResolver;
    /* 0079 */       Class value1 = 4ClassInfo.getCls();
    /* 0080 */       Class cls1 = 5.getClass();
    /* 0081 */       if ((value1 != cls1)) {
    /* 0082 */           4ClassInfo = classResolver.getClassInfo(cls1);
    /* 0083 */       }
    /* 0084 */       classResolver.writeClass(memoryBuffer2, 4ClassInfo);
    /* 0085 */       4ClassInfo.getSerializer().write(memoryBuffer2, 5);
    /* 0086 */   }
    /* 0087 */ 
    /* 0088 */   private void writeClassAndObject3(MemoryBuffer memoryBuffer3, org.apache.spark.sql.types.Decimal. 7) {
    /* 0089 */       ClassResolver classResolver = this.classResolver;
    /* 0090 */       Class value2 = 6ClassInfo.getCls();
    /* 0091 */       Class cls2 = 7.getClass();
    /* 0092 */       if ((value2 != cls2)) {
    /* 0093 */           6ClassInfo = classResolver.getClassInfo(cls2);
    /* 0094 */       }
    /* 0095 */       classResolver.writeClass(memoryBuffer3, 6ClassInfo);
    /* 0096 */       6ClassInfo.getSerializer().write(memoryBuffer3, 7);
    /* 0097 */   }
    /* 0098 */ 
    /* 0099 */   private void writeFields(org.apache.spark.sql.types.DecimalType decimalType1, MemoryBuffer memoryBuffer4) {
    /* 0100 */       MapRefResolver refResolver = this.refResolver;
    /* 0101 */       Object object1 = Platform.getObject(decimalType1, 40L);
    /* 0102 */       org.apache.spark.sql.types.Decimal. asIntegral = (org.apache.spark.sql.types.Decimal.)object1;
    /* 0103 */       if ((!refResolver.writeRefOrNull(memoryBuffer4, asIntegral))) {
    /* 0104 */           this.writeClassAndObject(memoryBuffer4, asIntegral);
    /* 0105 */       }
    /* 0106 */       Object object22 = Platform.getObject(decimalType1, 32L);
    /* 0107 */       org.apache.spark.sql.types.Decimal. fractional = (org.apache.spark.sql.types.Decimal.)object22;
    /* 0108 */       if ((!refResolver.writeRefOrNull(memoryBuffer4, fractional))) {
    /* 0109 */           this.writeClassAndObject1(memoryBuffer4, fractional);
    /* 0110 */       }
    /* 0111 */       Object object33 = Platform.getObject(decimalType1, 28L);
    /* 0112 */       org.apache.spark.sql.types.Decimal. numeric = (org.apache.spark.sql.types.Decimal.)object33;
    /* 0113 */       if ((!refResolver.writeRefOrNull(memoryBuffer4, numeric))) {
    /* 0114 */           this.writeClassAndObject2(memoryBuffer4, numeric);
    /* 0115 */       }
    /* 0116 */       Object object44 = Platform.getObject(decimalType1, 36L);
    /* 0117 */       org.apache.spark.sql.types.Decimal. ordering = (org.apache.spark.sql.types.Decimal.)object44;
    /* 0118 */       if ((!refResolver.writeRefOrNull(memoryBuffer4, ordering))) {
    /* 0119 */           this.writeClassAndObject3(memoryBuffer4, ordering);
    /* 0120 */       }
    /* 0121 */   }
    /* 0122 */ 
    /* 0123 */   private void readFields(org.apache.spark.sql.types.DecimalType decimalType2, MemoryBuffer memoryBuffer5) {
    /* 0124 */       MapRefResolver refResolver = this.refResolver;
    /* 0125 */       ClassResolver classResolver = this.classResolver;
    /* 0126 */       int refId = refResolver.tryPreserveRefId(memoryBuffer5);
    /* 0127 */       if ((refId >= ((byte)-1))) {
    /* 0128 */           Object object0 = classResolver.readClassInfo(memoryBuffer5, 8ClassInfoHolder).getSerializer().read(memoryBuffer5);
    /* 0129 */           refResolver.setReadObject(refId, object0);
    /* 0130 */           Platform.putObject(decimalType2, 40L, ((org.apache.spark.sql.types.Decimal.)object0));
    /* 0131 */       } else {
    /* 0132 */           Platform.putObject(decimalType2, 40L, ((org.apache.spark.sql.types.Decimal.)refResolver.getReadObject()));
    /* 0133 */       }
    /* 0134 */       int refId1 = refResolver.tryPreserveRefId(memoryBuffer5);
    /* 0135 */       if ((refId1 >= ((byte)-1))) {
    /* 0136 */           Object object5 = classResolver.readClassInfo(memoryBuffer5, 9ClassInfoHolder).getSerializer().read(memoryBuffer5);
    /* 0137 */           refResolver.setReadObject(refId1, object5);
    /* 0138 */           Platform.putObject(decimalType2, 32L, ((org.apache.spark.sql.types.Decimal.)object5));
    /* 0139 */       } else {
    /* 0140 */           Platform.putObject(decimalType2, 32L, ((org.apache.spark.sql.types.Decimal.)refResolver.getReadObject()));
    /* 0141 */       }
    /* 0142 */       int refId2 = refResolver.tryPreserveRefId(memoryBuffer5);
    /* 0143 */       if ((refId2 >= ((byte)-1))) {
    /* 0144 */           Object object6 = classResolver.readClassInfo(memoryBuffer5, 10ClassInfoHolder).getSerializer().read(memoryBuffer5);
    /* 0145 */           refResolver.setReadObject(refId2, object6);
    /* 0146 */           Platform.putObject(decimalType2, 28L, ((org.apache.spark.sql.types.Decimal.)object6));
    /* 0147 */       } else {
    /* 0148 */           Platform.putObject(decimalType2, 28L, ((org.apache.spark.sql.types.Decimal.)refResolver.getReadObject()));
    /* 0149 */       }
    /* 0150 */       int refId3 = refResolver.tryPreserveRefId(memoryBuffer5);
    /* 0151 */       if ((refId3 >= ((byte)-1))) {
    /* 0152 */           Object object7 = classResolver.readClassInfo(memoryBuffer5, 11ClassInfoHolder).getSerializer().read(memoryBuffer5);
    /* 0153 */           refResolver.setReadObject(refId3, object7);
    /* 0154 */           Platform.putObject(decimalType2, 36L, ((org.apache.spark.sql.types.Decimal.)object7));
    /* 0155 */       } else {
    /* 0156 */           Platform.putObject(decimalType2, 36L, ((org.apache.spark.sql.types.Decimal.)refResolver.getReadObject()));
    /* 0157 */       }
    /* 0158 */   }
    /* 0159 */ 
    /* 0160 */   @Override public final void write(MemoryBuffer buffer, Object obj) {
    /* 0161 */       org.apache.spark.sql.types.DecimalType decimalType3 = (org.apache.spark.sql.types.DecimalType)obj;
    /* 0162 */       buffer.grow(16);
    /* 0163 */       byte[] base = buffer.getHeapMemory();
    /* 0164 */       buffer._unsafeWriteVarInt32(Platform.getInt(decimalType3, 12L));
    /* 0165 */       buffer._unsafeWriteVarInt32(Platform.getInt(decimalType3, 16L));
    /* 0166 */       this.writeFields(decimalType3, buffer);
    /* 0167 */   }
    /* 0168 */ 
    /* 0169 */   @Override public final Object read(MemoryBuffer buffer) {
    /* 0170 */       org.apache.spark.sql.types.DecimalType decimalType4 = new org.apache.spark.sql.types.DecimalType();
    /* 0171 */       refResolver.reference(decimalType4);
    /* 0172 */       byte[] heapBuffer = buffer.getHeapMemory();
    /* 0173 */       Platform.putInt(decimalType4, 12L, buffer._readVarInt32OnLE());
    /* 0174 */       Platform.putInt(decimalType4, 16L, buffer._readVarInt32OnLE());
    /* 0175 */       this.readFields(decimalType4, buffer);
    /* 0176 */       return decimalType4;
    /* 0177 */   }
    /* 0178 */ 
    /* 0179 */ }
   ```
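
   The Janino error (`Line 28, Column 21: IDENTIFIER expected instead of '2'`) points at `/* 0028 */ private ClassInfo 2ClassInfo;` in the dump: the generated member names begin with digits, which Java identifiers may not, and the parameter names `1`, `3`, `5`, `7` plus the dangling type `org.apache.spark.sql.types.Decimal.` appear to come from the same naming problem. A fix on the codegen side would presumably need to mangle such names before emitting source; a minimal sketch of that idea (hypothetical helper, not Fury's actual API):

    ```java
    public class GeneratedNameSanitizer {
      // Illustrative only: ensure a candidate generated name is a legal Java
      // identifier by prefixing digit-leading names. The real fix would live in
      // Fury's codegen name-allocation logic.
      static String sanitize(String name) {
        if (!name.isEmpty() && Character.isDigit(name.charAt(0))) {
          return "f" + name; // "2ClassInfo" -> "f2ClassInfo"
        }
        return name;
      }

      public static void main(String[] args) {
        System.out.println(sanitize("2ClassInfo")); // f2ClassInfo
        System.out.println(sanitize("ClassInfo"));  // ClassInfo
      }
    }
    ```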
   
   ### Anything Else?
   
   _No response_
   
   ### Are you willing to submit a PR?
   
   - [X] I'm willing to submit a PR!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

