ashishmgofficial removed a comment on issue #2149:
URL: https://github.com/apache/hudi/issues/2149#issuecomment-704569933


@bvaradar I implemented the transformer class as follows:
   
```
import static org.apache.spark.sql.functions.lit;

// NOTE: on some Hudi releases TypedProperties lives in
// org.apache.hudi.common.util rather than org.apache.hudi.common.config.
import org.apache.hudi.common.config.TypedProperties;
import org.apache.hudi.utilities.transform.Transformer;
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class DebeziumCustomTransformer implements Transformer {

  private static final Logger LOG = LogManager.getLogger(DebeziumCustomTransformer.class);

  @Override
  public Dataset<Row> apply(JavaSparkContext jsc, SparkSession sparkSession,
      Dataset<Row> rowDataset, TypedProperties properties) {

    // Inserts and updates: keep the Debezium "after" image, mark as not deleted.
    Dataset<Row> insertedOrUpdatedData = rowDataset
        .select("op", "ts_ms", "after.*")
        .withColumnRenamed("op", "_op")
        .withColumn("_is_hoodie_deleted", lit("false"))
        .withColumnRenamed("ts_ms", "_ts_ms")
        .filter(rowDataset.col("op").notEqual("d"));

    // Deletes: keep the "before" image, mark as deleted.
    Dataset<Row> deletedData = rowDataset
        .select("op", "ts_ms", "before.*")
        .withColumnRenamed("op", "_op")
        .withColumn("_is_hoodie_deleted", lit("true"))
        .withColumnRenamed("ts_ms", "_ts_ms")
        .filter(rowDataset.col("op").equalTo("d"));

    return insertedOrUpdatedData.union(deletedData);
  }
}
```
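
To take DeltaStreamer itself out of the picture, this is a minimal local harness for the transformer (a sketch only: the class name `TransformerSmokeTest` and the sample schema/values are made up to mimic a simplified Debezium envelope, and `properties` is passed as `null` because this transformer never reads it):

```
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Hypothetical harness: runs the transformer on an in-memory frame whose
// layout mimics a simplified Debezium envelope (op, ts_ms, before, after).
public class TransformerSmokeTest {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .master("local[1]")
        .appName("transformer-smoke-test")
        .getOrCreate();
    JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

    // One update and one delete, built with Spark SQL's named_struct.
    Dataset<Row> input = spark.sql(
        "SELECT 'u' AS op, 1602000000000 AS ts_ms, "
            + "named_struct('id', 1, 'name', 'old') AS before, "
            + "named_struct('id', 1, 'name', 'new') AS after "
            + "UNION ALL "
            + "SELECT 'd', 1602000001000, "
            + "named_struct('id', 2, 'name', 'gone'), "
            + "CAST(NULL AS STRUCT<id: INT, name: STRING>)");

    // Properties are unused by this transformer, so null is fine here.
    Dataset<Row> out = new DebeziumCustomTransformer().apply(jsc, spark, input, null);
    out.show(false);

    spark.stop();
  }
}
```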
   
But I'm facing the following error:
   
```
Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
        at org.apache.spark.sql.avro.SchemaConverters$SchemaType.<init>(SchemaConverters.scala:40)
        at org.apache.spark.sql.avro.SchemaConverters$.toSqlTypeHelper(SchemaConverters.scala:53)
        at org.apache.spark.sql.avro.SchemaConverters$.$anonfun$toSqlTypeHelper$1(SchemaConverters.scala:82)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.Iterator$class.foreach(Iterator.scala:891)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
        at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
        at scala.collection.AbstractTraversable.map(Traversable.scala:104)
        at org.apache.spark.sql.avro.SchemaConverters$.toSqlTypeHelper(SchemaConverters.scala:81)
        at org.apache.spark.sql.avro.SchemaConverters$.toSqlTypeHelper(SchemaConverters.scala:105)
        at org.apache.spark.sql.avro.SchemaConverters$.$anonfun$toSqlTypeHelper$1(SchemaConverters.scala:82)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.Iterator$class.foreach(Iterator.scala:891)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
        at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
        at scala.collection.AbstractTraversable.map(Traversable.scala:104)
        at org.apache.spark.sql.avro.SchemaConverters$.toSqlTypeHelper(SchemaConverters.scala:81)
        at org.apache.spark.sql.avro.SchemaConverters$.toSqlType(SchemaConverters.scala:46)
        at org.apache.hudi.AvroConversionUtils$.convertAvroSchemaToStructType(AvroConversionUtils.scala:81)
        at org.apache.hudi.AvroConversionUtils$.createDataFrame(AvroConversionUtils.scala:70)
        at org.apache.hudi.AvroConversionUtils.createDataFrame(AvroConversionUtils.scala)
        at org.apache.hudi.utilities.deltastreamer.SourceFormatAdapter.lambda$fetchNewDataInRowFormat$2(SourceFormatAdapter.java:101)
        at org.apache.hudi.common.util.Option.map(Option.java:104)
        at org.apache.hudi.utilities.deltastreamer.SourceFormatAdapter.fetchNewDataInRowFormat(SourceFormatAdapter.java:101)
        at org.apache.hudi.utilities.deltastreamer.DeltaSync.readFromSource(DeltaSync.java:307)
        at org.apache.hudi.utilities.deltastreamer.DeltaSync.syncOnce(DeltaSync.java:238)
        at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.lambda$sync$2(HoodieDeltaStreamer.java:163)
        at org.apache.hudi.common.util.Option.ifPresent(Option.java:96)
        at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.sync(HoodieDeltaStreamer.java:161)
        at org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer.main(HoodieDeltaStreamer.java:466)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:853)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:928)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:937)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
```
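
As far as I can tell, `NoSuchMethodError: scala.Product.$init$` is the classic symptom of mixed Scala binary versions on the classpath, e.g. a `spark-avro_2.12` jar running against a Spark build compiled for Scala 2.11 (the trait-method encoding changed in Scala 2.12, so this method simply does not exist in a 2.11 `scala-library`). A tiny check to see which `scala-library` the driver actually loads (the class name below is made up; the Scala and reflection calls are standard):

```
// Hypothetical diagnostic: prints the Scala runtime version and the jar it
// was loaded from. scala.Product.$init$ exists only in Scala 2.12+, so a
// 2.11 scala-library paired with 2.12-compiled code fails exactly like above.
public class ScalaClasspathCheck {
  public static void main(String[] args) {
    // scala.util.Properties is a Scala object; from Java it is reached
    // through the MODULE$ singleton field.
    System.out.println("Scala runtime: "
        + scala.util.Properties$.MODULE$.versionNumberString());
    // Location of the scala-library jar that was actually loaded.
    System.out.println("scala-library from: "
        + scala.Product.class.getProtectionDomain().getCodeSource().getLocation());
  }
}
```

If the versions disagree (for example `--packages org.apache.spark:spark-avro_2.12:2.4.4` on a `_2.11` Spark distribution), aligning the artifact's Scala suffix with the Spark build should clear this.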

