NickyYe commented on a change in pull request #2080:
URL: https://github.com/apache/hadoop/pull/2080#discussion_r443081834
##########
File path:
hadoop-hdfs-project/hadoop-hdfs-rbf/src/main/java/org/apache/hadoop/hdfs/server/federation/router/RouterWebHdfsMethods.java
##########
@@ -454,19 +454,12 @@ private URI redirectURI(final Router router, final
UserGroupInformation ugi,
private DatanodeInfo chooseDatanode(final Router router,
final String path, final HttpOpParam.Op op, final long openOffset,
final String excludeDatanodes) throws IOException {
- // We need to get the DNs as a privileged user
final RouterRpcServer rpcServer = getRPCServer(router);
- UserGroupInformation loginUser = UserGroupInformation.getLoginUser();
- RouterRpcServer.setCurrentUser(loginUser);
-
DatanodeInfo[] dns = null;
try {
- dns = rpcServer.getDatanodeReport(DatanodeReportType.LIVE);
+ dns = rpcServer.getCachedDatanodeReport(DatanodeReportType.LIVE);
Review comment:
You are correct; I found this issue as well. The client could be redirected to
a datanode from a different subcluster than the one where the path is actually
mounted, but the file would still be written to the correct subcluster. We may
need to open a new JIRA for this.
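To illustrate the concern (a simplified, hypothetical model, not the actual Router code: `DatanodeInfo`, `MOUNT_TABLE`, and `chooseDatanode` below are stand-ins): a cached, federation-wide datanode report mixes datanodes from all subclusters, so a redirect picked from it can point at a datanode outside the subcluster the mount table resolves the path to. One sketch of a fix is to filter the cached report by the resolved subcluster:

```java
import java.util.*;

public class SubclusterRedirectSketch {
  // Hypothetical stand-in for a datanode entry in the cached report.
  static class DatanodeInfo {
    final String host;
    final String subcluster; // nameservice this DN belongs to
    DatanodeInfo(String host, String subcluster) {
      this.host = host;
      this.subcluster = subcluster;
    }
  }

  // Hypothetical mount table: path prefix -> owning subcluster.
  static final Map<String, String> MOUNT_TABLE = Map.of(
      "/data", "ns0",
      "/logs", "ns1");

  static String resolveSubcluster(String path) {
    for (Map.Entry<String, String> e : MOUNT_TABLE.entrySet()) {
      if (path.startsWith(e.getKey())) {
        return e.getValue();
      }
    }
    throw new IllegalArgumentException("No mount point for " + path);
  }

  // Choose a live DN only from the subcluster that owns the path.
  // Without this filter, a federation-wide cached report could redirect
  // the client to a DN in a different subcluster than the mount target.
  static DatanodeInfo chooseDatanode(List<DatanodeInfo> cachedReport,
      String path) {
    String ns = resolveSubcluster(path);
    return cachedReport.stream()
        .filter(dn -> dn.subcluster.equals(ns))
        .findFirst()
        .orElseThrow(() -> new IllegalStateException("No live DN in " + ns));
  }

  public static void main(String[] args) {
    List<DatanodeInfo> report = List.of(
        new DatanodeInfo("dn1.example.com", "ns0"),
        new DatanodeInfo("dn2.example.com", "ns1"));
    DatanodeInfo dn = chooseDatanode(report, "/logs/app.log");
    System.out.println(dn.host + " in " + dn.subcluster);
    // Prints: dn2.example.com in ns1
  }
}
```

Whether the real fix filters the cached report, keys the cache per nameservice, or resolves the subcluster earlier is a design choice for the follow-up JIRA.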
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]