imbajin commented on code in PR #571:
URL: https://github.com/apache/incubator-hugegraph-toolchain/pull/571#discussion_r1474214018
##########
hugegraph-loader/src/main/java/org/apache/hugegraph/loader/reader/hdfs/HDFSFileReader.java:
##########
@@ -109,7 +125,18 @@ protected List<Readable> scanReadables() throws IOException {
                 paths.add(new HDFSFile(this.hdfs, path));
             } else {
                 assert this.hdfs.isDirectory(path);
-                FileStatus[] statuses = this.hdfs.listStatus(path);
+                FileStatus[] statuses;
+                if (prefix == null || prefix.isEmpty()) {
+                    statuses = this.hdfs.listStatus(path);
+                } else {
+                    PathFilter prefixFilter = new PathFilter() {
+                        @Override
+                        public boolean accept(Path path) {
+                            return path.getName().startsWith(prefix);
+                        }
+                    };
+                    statuses = this.hdfs.listStatus(path, prefixFilter);
+                }
Review Comment:
Maybe we should not list all the HDFS paths together (it may occupy large memory); could we use an iterator to scan them instead?
Adding a gist here (if we only need to scan the 1st-depth dir):
```java
assert this.hdfs.isDirectory(path);
RemoteIterator<FileStatus> iter = this.hdfs.listStatusIterator(path);
while (iter.hasNext()) {
    FileStatus subStatus = iter.next();
    // check whether the file/dir name starts with the prefix & passes the filter
    if ((prefix == null || prefix.isEmpty() ||
         subStatus.getPath().getName().startsWith(prefix)) &&
        filter.reserved(subStatus.getPath().getName())) {
        paths.add(new HDFSFile(this.hdfs, subStatus.getPath()));
    }
}
```
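BTW, IIRC `listStatusIterator` fetches the listing from the NameNode in batches (batch size is controlled by `dfs.ls.limit`), so the memory stays bounded even for a huge dir.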
If we want to scan the dir **recursively** (maybe better to consider a DFS way, rather than listing everything together):
```java
assert this.hdfs.isDirectory(path);
// BFS mode: keep a queue of pending dirs and list each one lazily,
// instead of materializing a whole FileStatus[] per dir
// (Queue & ArrayDeque from java.util)
Queue<Path> pending = new ArrayDeque<>();
pending.add(path);
while (!pending.isEmpty()) {
    RemoteIterator<FileStatus> iter =
            this.hdfs.listStatusIterator(pending.poll());
    while (iter.hasNext()) {
        FileStatus subStatus = iter.next();
        if (subStatus.isFile()) {
            if (filter.reserved(subStatus.getPath().getName())) {
                paths.add(new HDFSFile(this.hdfs, subStatus.getPath()));
            }
        } else {
            assert subStatus.isDirectory();
            // enqueue the sub dir so deeper levels are scanned too
            pending.add(subStatus.getPath());
        }
    }
}
```
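With the queue above, the peak memory is roughly bounded by the widest directory level, while the DFS way below is bounded by the tree depth instead.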
A simple DFS way could look like this:
```java
// DFS scan to improve perf?
private void scanPath(Path path, FileFilter filter, List<Readable> paths)
        throws IOException {
    FileStatus status = this.hdfs.getFileStatus(path);
    if (status.isFile()) {
        // keep only files that match the prefix & pass the filter
        if ((prefix == null || prefix.isEmpty() ||
             path.getName().startsWith(prefix)) &&
            filter.reserved(path.getName())) {
            paths.add(new HDFSFile(this.hdfs, path));
        }
    } else {
        assert status.isDirectory();
        RemoteIterator<FileStatus> iter = this.hdfs.listStatusIterator(path);
        while (iter.hasNext()) {
            FileStatus subStatus = iter.next();
            // recurse into each sub file/dir
            this.scanPath(subStatus.getPath(), filter, paths);
        }
    }
}
```
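(nit) Unrelated to the scan strategy: the anonymous `PathFilter` in the diff could also be a lambda, a minor sketch (assuming `prefix` is effectively final there):
```java
// Java 8+ lambda form of the prefix filter, behavior unchanged
PathFilter prefixFilter = p -> p.getName().startsWith(prefix);
statuses = this.hdfs.listStatus(path, prefixFilter);
```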