rahulpoptani commented on issue #2180:
URL: https://github.com/apache/hudi/issues/2180#issuecomment-717046074


   @vinothchandar I'm not facing the jetty issue, but I gave the master build a try and ran into another issue while reading (both snapshot and read-optimized queries).
   Here is the error:
   ```
   java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes
        at org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex.<clinit>(HFileBootstrapIndex.java:92)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.hudi.common.util.ReflectionUtils.getClass(ReflectionUtils.java:53)
        at org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:87)
        at org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:98)
        at org.apache.hudi.common.bootstrap.index.BootstrapIndex.getBootstrapIndex(BootstrapIndex.java:159)
        at org.apache.hudi.common.table.view.AbstractTableFileSystemView.init(AbstractTableFileSystemView.java:100)
        at org.apache.hudi.common.table.view.HoodieTableFileSystemView.init(HoodieTableFileSystemView.java:95)
        at org.apache.hudi.common.table.view.HoodieTableFileSystemView.<init>(HoodieTableFileSystemView.java:89)
        at org.apache.hudi.common.table.view.HoodieTableFileSystemView.<init>(HoodieTableFileSystemView.java:80)
        at org.apache.hudi.common.table.view.HoodieTableFileSystemView.<init>(HoodieTableFileSystemView.java:130)
        at org.apache.hudi.hadoop.HoodieROTablePathFilter.accept(HoodieROTablePathFilter.java:167)
        at org.apache.spark.sql.execution.datasources.InMemoryFileIndex$$anonfun$16.apply(InMemoryFileIndex.scala:353)
        at org.apache.spark.sql.execution.datasources.InMemoryFileIndex$$anonfun$16.apply(InMemoryFileIndex.scala:353)
        at scala.collection.TraversableLike$$anonfun$filterImpl$1.apply(TraversableLike.scala:248)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
        at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
        at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
        at org.apache.spark.sql.execution.datasources.InMemoryFileIndex$.org$apache$spark$sql$execution$datasources$InMemoryFileIndex$$listLeafFiles(InMemoryFileIndex.scala:353)
        at org.apache.spark.sql.execution.datasources.InMemoryFileIndex$$anonfun$bulkListLeafFiles$2.apply(InMemoryFileIndex.scala:264)
        at org.apache.spark.sql.execution.datasources.InMemoryFileIndex$$anonfun$bulkListLeafFiles$2.apply(InMemoryFileIndex.scala:263)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
        at scala.collection.AbstractTraversable.map(Traversable.scala:104)
        at org.apache.spark.sql.execution.datasources.InMemoryFileIndex$.bulkListLeafFiles(InMemoryFileIndex.scala:263)
        at org.apache.spark.sql.execution.datasources.InMemoryFileIndex.listLeafFiles(InMemoryFileIndex.scala:130)
        at org.apache.spark.sql.execution.datasources.InMemoryFileIndex.refresh0(InMemoryFileIndex.scala:94)
        at org.apache.spark.sql.execution.datasources.InMemoryFileIndex.<init>(InMemoryFileIndex.scala:70)
        at org.apache.spark.sql.execution.datasources.DataSource.org$apache$spark$sql$execution$datasources$DataSource$$createInMemoryFileIndex(DataSource.scala:589)
        at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:419)
        at org.apache.hudi.DefaultSource.getBaseFileOnlyView(DefaultSource.scala:172)
        at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:93)
        at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:51)
        at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:351)
        at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:311)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:297)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:214)
   ```
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]