galadrielwithlaptop commented on issue #9495:
URL: https://github.com/apache/hudi/issues/9495#issuecomment-1752659628

   A DStream job that writes data in Hudi format to an ABFS directory fails 
with: "Caused by: java.lang.NoClassDefFoundError: Could not initialize 
class org.apache.hudi.org.apache.hadoop.hbase.io.hfile.HFile"
   PS: the HFile class was present in the DStream jar we submitted.
   
   How we solved it:
   1.   We tried the solution suggested by the Hudi community:
   
https://hudi.apache.org/docs/faq/#how-can-i-resolve-the-nosuchmethoderror-from-hbase-when-using-hudi-with-metadata-table-on-hdfs
   But to no avail.
   2.   We compiled Hudi against the same Hadoop version that Flink 1.16.0 uses 
(Hadoop 3.3.2), with the Hive3 profile. Again, to no avail.
   3.   We tried configs such as setting the classloading order to parent-first. No avail.
   4.   Finally, we kept the same hudi-flink jar, compiled against Hadoop 
3.3.2 and HBase 2.4.9, and added it to the server classpath. That worked.
   The root cause was a dynamic class-loading issue in Flink. The class 
"org.apache.hudi.org.apache.hadoop.hbase.io.hfile.HFile" is a shaded class, 
and Flink (or the JVM) has trouble loading such shaded classes via dynamic class loading.
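   For anyone hitting the same error, steps 3 and 4 above can be sketched as 
follows. This is only an illustration of the approach described here, not a 
verified fix; the exact bundle jar name depends on how Hudi was built.

```yaml
# flink-conf.yaml -- step 3: resolve classes from the parent (cluster)
# classpath before the dynamically loaded user-code classloader, so the
# shaded HBase classes are loaded once by the parent classloader.
classloader.resolve-order: parent-first
```

   For step 4, the Hudi Flink bundle jar (compiled against Hadoop 3.3.2 and 
HBase 2.4.9) is placed directly on the server classpath, e.g. copied into 
`$FLINK_HOME/lib/`, rather than shipped inside the submitted job jar, so the 
shaded `HFile` class is loaded by the parent classloader instead of Flink's 
per-job dynamic classloader.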
   

