imbajin commented on code in PR #571:
URL: https://github.com/apache/incubator-hugegraph-toolchain/pull/571#discussion_r1474214018
##########
hugegraph-loader/src/main/java/org/apache/hugegraph/loader/reader/hdfs/HDFSFileReader.java:
##########
@@ -109,7 +125,18 @@ protected List<Readable> scanReadables() throws IOException {
            paths.add(new HDFSFile(this.hdfs, path));
        } else {
            assert this.hdfs.isDirectory(path);
-           FileStatus[] statuses = this.hdfs.listStatus(path);
+           FileStatus[] statuses;
+           if (prefix == null || prefix.isEmpty()) {
+               statuses = this.hdfs.listStatus(path);
+           } else {
+               PathFilter prefixFilter = new PathFilter() {
+                   @Override
+                   public boolean accept(Path path) {
+                       return path.getName().startsWith(prefix);
+                   }
+               };
+               statuses = this.hdfs.listStatus(path, prefixFilter);
+           }
Review Comment:
Maybe we should not list all the HDFS paths together (it may occupy a lot of memory); could we use an iterator to scan them instead?
Adding a gist here:
```java
assert this.hdfs.isDirectory(path);
// Scan the directory lazily instead of materializing the whole
// FileStatus[] listing in memory at once
RemoteIterator<FileStatus> iter = this.hdfs.listStatusIterator(path);
while (iter.hasNext()) {
    FileStatus subStatus = iter.next();
    if (subStatus.isFile()) {
        if (filter.reserved(subStatus.getPath().getName())) {
            paths.add(new HDFSFile(this.hdfs, subStatus.getPath()));
        }
    } else {
        assert subStatus.isDirectory();
        // Descend one level into sub-directories
        Path[] subPaths = FileUtil.stat2Paths(
                this.hdfs.listStatus(subStatus.getPath()));
        for (Path subPath : subPaths) {
            if (filter.reserved(subPath.getName())) {
                paths.add(new HDFSFile(this.hdfs, subPath));
            }
        }
    }
}
```
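
For reference, a minimal sketch of how the prefix check from this PR could be folded into the iterator-based scan. It assumes the same `prefix`, `filter`, and `paths` from the surrounding `scanReadables()` method, and omits the sub-directory handling from the gist above for brevity; this is just an illustration, not the final implementation:

```java
// Sketch: apply the prefix filter while iterating, so no full
// FileStatus[] array is ever built for the directory listing
RemoteIterator<FileStatus> iter = this.hdfs.listStatusIterator(path);
while (iter.hasNext()) {
    FileStatus subStatus = iter.next();
    String name = subStatus.getPath().getName();
    // Skip entries that don't match the configured prefix (if any)
    if (prefix != null && !prefix.isEmpty() && !name.startsWith(prefix)) {
        continue;
    }
    if (subStatus.isFile() && filter.reserved(name)) {
        paths.add(new HDFSFile(this.hdfs, subStatus.getPath()));
    }
}
```

For HDFS, `listStatusIterator` fetches the listing from the NameNode in batches rather than all at once, which is what keeps memory bounded for very large directories.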
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.