stym06 commented on issue #4318:
URL: https://github.com/apache/hudi/issues/4318#issuecomment-997342110


   @nsivabalan any idea how I can start spark-shell with the Azure-related dependencies? I'm getting the below error:
   ```
   org.apache.hudi.exception.HoodieIOException: Failed to get instance of org.apache.hadoop.fs.FileSystem
     at org.apache.hudi.common.fs.FSUtils.getFs(FSUtils.java:104)
     at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:87)
     at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:69)
     at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:354)
     at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:326)
     at org.apache.spark.sql.DataFrameReader.$anonfun$load$3(DataFrameReader.scala:308)
     at scala.Option.getOrElse(Option.scala:189)
     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:308)
     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:240)
     ... 59 elided
   Caused by: java.io.IOException: No FileSystem for scheme: wasb
     at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2660)
     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2667)
     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
     at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
     at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
     at org.apache.hudi.common.fs.FSUtils.getFs(FSUtils.java:102)
   ```
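
   For context, `No FileSystem for scheme: wasb` usually means the `hadoop-azure` JAR (which provides `org.apache.hadoop.fs.azure.NativeAzureFileSystem`) is not on the classpath. A possible way to launch spark-shell with it is sketched below; the Hudi bundle and `hadoop-azure` versions are assumptions (match `hadoop-azure` to your cluster's Hadoop version), and `<storage-account>`/`<access-key>` are placeholders for your own values:
   ```shell
   # Sketch, not a verified fix: pull the Azure filesystem JARs via --packages
   # and point the wasb:// scheme at the NativeAzureFileSystem implementation.
   spark-shell \
     --packages org.apache.hudi:hudi-spark3-bundle_2.12:0.10.0,org.apache.hadoop:hadoop-azure:3.2.1 \
     --conf spark.hadoop.fs.wasb.impl=org.apache.hadoop.fs.azure.NativeAzureFileSystem \
     --conf spark.hadoop.fs.azure.account.key.<storage-account>.blob.core.windows.net=<access-key> \
     --conf spark.serializer=org.apache.spark.serializer.KryoSerializer
   ```
   `--packages` also pulls the transitive `azure-storage` dependency, which `NativeAzureFileSystem` needs at runtime.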


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
