baitian77 opened a new issue, #5564:
URL: https://github.com/apache/gravitino/issues/5564
### Version
0.6.0
### Describe what's wrong
1. Failed to execute: `hadoop fs -rm gvfs://fileset/catalog/schema/hdfs_fileset_1/jstack-21727`
2. The Hadoop `rm` command is translated into a `mv` to the trash directory; the GVFS client then tries to resolve the trash path (`.Trash` under the user's home directory) as a fileset, which fails.
### Error message and/or stacktrace
```
-rm: Fatal internal error
java.lang.RuntimeException: Cannot load fileset: metalake_stg01.user.root..Trash from the server. exception: Failed to operate catalog(s) [user] operation [LOAD] under metalake [metalake_stg01], reason [Catalog metalake_stg01.user does not exist]
org.apache.gravitino.exceptions.NoSuchCatalogException: Catalog metalake_stg01.user does not exist
	at org.apache.gravitino.catalog.CatalogManager.loadCatalogInternal(CatalogManager.java:649)
	...
	at org.apache.gravitino.shaded.com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
	at org.apache.gravitino.shaded.com.github.benmanes.caffeine.cache.LocalManualCache.get(LocalManualCache.java:62)
	at org.apache.gravitino.filesystem.hadoop.GravitinoVirtualFileSystem.getFilesetContext(GravitinoVirtualFileSystem.java:386)
	at org.apache.gravitino.filesystem.hadoop.GravitinoVirtualFileSystem.mkdirs(GravitinoVirtualFileSystem.java:545)
	at org.apache.hadoop.fs.TrashPolicyDefault.moveToTrash(TrashPolicyDefault.java:147)
	at org.apache.hadoop.fs.Trash.moveToTrash(Trash.java:110)
	at org.apache.hadoop.fs.Trash.moveToAppropriateTrash(Trash.java:96)
	at org.apache.hadoop.fs.shell.Delete$Rm.moveToTrash(Delete.java:153)
	at org.apache.hadoop.fs.shell.Delete$Rm.processPath(Delete.java:118)
	at org.apache.hadoop.fs.shell.Command.processPathInternal(Command.java:367)
	at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:331)
	at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:304)
	at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:286)
	at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:270)
	at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:120)
	at org.apache.hadoop.fs.shell.Command.run(Command.java:177)
	at org.apache.hadoop.fs.FsShell.run(FsShell.java:328)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
	at org.apache.hadoop.fs.FsShell.main(FsShell.java:391)
```
### How to reproduce
1. Create an HDFS fileset.
2. Run `hadoop fs -rm` on `fileset/{sub_file}`.
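
The identifier in the error message shows how the trash path is being misread. A minimal sketch (a hypothetical helper, not Gravitino's actual code) of how a trash location such as `/user/root/.Trash`, when treated as a GVFS virtual path, decomposes into catalog/schema/fileset segments and yields the identifier from the stack trace:

```shell
# Hypothetical illustration: split the first three path components into
# catalog, schema, and fileset, then build the dotted identifier the way
# the error message suggests the client does.
to_fileset_identifier() {
  metalake="$1"
  path="${2#/}"                  # drop the leading slash
  catalog="${path%%/*}"; path="${path#*/}"
  schema="${path%%/*}";  path="${path#*/}"
  fileset="${path%%/*}"
  echo "${metalake}.${catalog}.${schema}.${fileset}"
}

# The default trash location for user root is /user/root/.Trash, so the
# client ends up looking up a "fileset" named .Trash under catalog user,
# schema root:
to_fileset_identifier "metalake_stg01" "/user/root/.Trash"
# -> metalake_stg01.user.root..Trash
```

As a workaround until trash paths are handled, `hadoop fs -rm -skipTrash <path>` deletes the file without the trash move; setting `fs.trash.interval=0` on the client disables trash entirely.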
### Additional context
_No response_