Thanks Balaji. Verified that it's working! I'll send in the patch soon.
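
For context, the patch referred to here is the shading change Balaji describes in the quoted reply below: the maven-shade-plugin configuration for hoodie-utilities uses an inclusion-style artifact filter, so chill-java (the artifact that actually carries KryoInstantiator) has to be listed explicitly alongside chill_2.11. A minimal sketch of how the include list would look; the surrounding plugin configuration and the other includes are assumptions, not copied from the actual pom:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <configuration>
        <artifactSet>
          <includes>
            <!-- ... existing includes ... -->
            <include>com.twitter:chill_2.11</include>
            <!-- chill-java provides com.twitter.chill.KryoInstantiator, which after
                 relocation becomes com.uber.hoodie.com.twitter.chill.KryoInstantiator,
                 the class missing from the utilities bundle at runtime -->
            <include>com.twitter:chill-java</include>
          </includes>
        </artifactSet>
      </configuration>
    </plugin>
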
On Mon, Apr 8, 2019 at 2:06 PM Balaji Varadarajan <[email protected]> wrote:

> Hi Sudha,
>
> It looks like the missing class is from a different jar which was not included in hoodie-utilities. The hoodie-utilities shading uses an inclusion-type filter, unlike the shading for the other bundles, hence the discrepancy.
> If you add the following line in the hoodie-utilities pom, it should hopefully work:
>
>       <include>com.twitter:chill_2.11</include>
>     + <include>com.twitter:chill-java</include>
>
> Balaji.V
>
> On Thursday, April 4, 2019, 10:58:56 PM PDT, Bhavani Sudha Saktheeswaran <[email protected]> wrote:
>
> Adding the hoodie-spark bundle and the hoodie-utilities bundle to the Spark jars fixes this issue. I'll send a patch to fix this. Thanks everyone!
>
> -Sudha
>
> On Wed, Apr 3, 2019 at 3:19 PM Omkar Joshi <[email protected]> wrote:
>
> Sudha,
>
> Try if this is passing for you: "mvn clean integration-test".
>
> Thanks,
> Omkar
>
> On Wed, Apr 3, 2019 at 2:57 PM Bhavani Sudha Saktheeswaran <[email protected]> wrote:
>
> Sure. Thanks! I'll update if I find anything!
>
> On Wed, Apr 3, 2019 at 2:54 PM Omkar Joshi <[email protected]> wrote:
>
> Hi Sudha,
>
> I haven't tried it via Docker. Let me try it sometime this week or early next week.
>
> On Wed, Apr 3, 2019 at 2:35 PM Bhavani Sudha Saktheeswaran <[email protected]> wrote:
>
> Hi Omkar,
>
> I am running the docker demo using the instructions here - https://hudi.apache.org/docker_demo.html. I get this exception when doing Step 5: Upsert of data using Delta Streamer. Maybe the Docker setup is picking up an old version of the jars? You can reproduce it on master.
>
> Thanks,
> Sudha
>
> On Wed, Apr 3, 2019 at 11:43 AM [email protected] <[email protected]> wrote:
>
> Sudha,
>
> How are you using the hudi library? Are you using the bundled jar or something else?
>
> packaging/hoodie-presto-bundle/target/hoodie-presto-bundle-0.4.6-SNAPSHOT.jar
>
> omkar-C02T60PVG8WL:hoodie omkar$ jar -tvf packaging/hoodie-presto-bundle/target/hoodie-presto-bundle-0.4.6-SNAPSHOT.jar | grep "KryoInstantiator"
>   569 Tue Mar 26 18:44:50 PDT 2019 com/uber/hoodie/com/twitter/chill/ScalaKryoInstantiator$$anon$1.class
>  1561 Tue Mar 26 18:44:50 PDT 2019 com/uber/hoodie/com/twitter/chill/EmptyScalaKryoInstantiator.class
>  1953 Tue Mar 26 18:44:50 PDT 2019 com/uber/hoodie/com/twitter/chill/ScalaKryoInstantiator$.class
>  1992 Tue Mar 26 18:44:50 PDT 2019 com/uber/hoodie/com/twitter/chill/ScalaKryoInstantiator.class
>   859 Tue Mar 26 18:44:52 PDT 2019 com/uber/hoodie/com/twitter/chill/KryoInstantiator$1.class
>   845 Tue Mar 26 18:44:52 PDT 2019 com/uber/hoodie/com/twitter/chill/KryoInstantiator$3.class
>   650 Tue Mar 26 18:44:52 PDT 2019 com/uber/hoodie/com/twitter/chill/config/ConfiguredInstantiator$CachedKryoInstantiator.class
>  2107 Tue Mar 26 18:44:52 PDT 2019 com/uber/hoodie/com/twitter/chill/KryoInstantiator.class
>   863 Tue Mar 26 18:44:52 PDT 2019 com/uber/hoodie/com/twitter/chill/KryoInstantiator$4.class
>   958 Tue Mar 26 18:44:52 PDT 2019 com/uber/hoodie/com/twitter/chill/KryoInstantiator$2.class
>   920 Tue Mar 26 18:44:52 PDT 2019 com/uber/hoodie/com/twitter/chill/KryoInstantiator$5.class
>   975 Tue Mar 26 18:44:52 PDT 2019 com/uber/hoodie/com/twitter/chill/KryoInstantiator$6.class
>
> On 2019/04/03 05:16:39, Bhavani Sudha Saktheeswaran <[email protected]> wrote:
>
> Hi,
>
> I am getting this error when trying to ingest the second batch of data (upserts) into the COW dataset. It looks like the KryoInstantiator class is missing from the jars. Is this something that needs to be added to the classpath separately?
>
> 2019-04-02 21:36:23 ERROR HoodieCopyOnWriteTable:274 - Error upserting bucketType UPDATE for partition :0
> java.lang.NoClassDefFoundError: com/uber/hoodie/com/twitter/chill/KryoInstantiator
>     at java.lang.ClassLoader.defineClass1(Native Method)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>     ...
>     at com.uber.hoodie.common.util.SerializationUtils.serialize(SerializationUtils.java:50)
>     at com.uber.hoodie.common.util.collection.DiskBasedMap.put(DiskBasedMap.java:169)
>     at com.uber.hoodie.common.util.collection.ExternalSpillableMap.put(ExternalSpillableMap.java:169)
>     at com.uber.hoodie.common.util.collection.ExternalSpillableMap.put(ExternalSpillableMap.java:42)
>     at com.uber.hoodie.io.HoodieMergeHandle.init(HoodieMergeHandle.java:159)
>     at com.uber.hoodie.io.HoodieMergeHandle.<init>(HoodieMergeHandle.java:73)
>     at com.uber.hoodie.table.HoodieCopyOnWriteTable.getUpdateHandle(HoodieCopyOnWriteTable.java:230)
>     at com.uber.hoodie.table.HoodieCopyOnWriteTable.handleUpdate(HoodieCopyOnWriteTable.java:184)
>     at com.uber.hoodie.table.HoodieCopyOnWriteTable.handleUpsertPartition(HoodieCopyOnWriteTable.java:267)
>     at com.uber.hoodie.HoodieWriteClient.lambda$upsertRecordsInternal$7ef77fd$1(HoodieWriteClient.java:440)
>     at org.apache.spark.api.java.JavaRDDLike$$anonfun$mapPartitionsWithIndex$1.apply(JavaRDDLike.scala:102)
>     at org.apache.spark.api.java.JavaRDDLike$$anonfun$mapPartitionsWithIndex$1.apply(JavaRDDLike.scala:102)
>     at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$26.apply(RDD.scala:847)
>     at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsWithIndex$1$$anonfun$apply$26.apply(RDD.scala:847)
>     at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>     at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>     at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>     at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>     at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>     at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
>     at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
>     at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1109)
>     at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1083)
>     at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1018)
>     at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1083)
>     at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:809)
>     at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
>     at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
>     at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>     at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
>     at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
>     at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
>     at org.apache.spark.scheduler.Task.run(Task.scala:109)
>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>     at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.ClassNotFoundException: com.uber.hoodie.com.twitter.chill.KryoInstantiator
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>
> Thanks,
> Sudha
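
A quick way to confirm the chill-java include took effect, mirroring the jar -tvf check in the thread above, is to grep the rebuilt hoodie-utilities jar for the relocated class. The jar path below is an assumption based on the module name and snapshot version mentioned in the thread; point it at whatever shaded jar the local build actually produces:

    # Rebuild, then check that the relocated KryoInstantiator is now packaged.
    # The path below is assumed, not taken from the thread.
    jar -tvf hoodie-utilities/target/hoodie-utilities-0.4.6-SNAPSHOT.jar | grep "chill/KryoInstantiator"
    # Expected: an entry for com/uber/hoodie/com/twitter/chill/KryoInstantiator.class
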
