colinmjj commented on PR #53:
URL: https://github.com/apache/incubator-uniffle/pull/53#issuecomment-1182810234

   > @colinmjj @jerqi
   > 
   > > For the Spark client, it can rely on Spark's implementation and read data according to the delegation token.
   > 
   > Yes. There is no need to retrieve the FileSystem via a UGI proxy user. The credentials have already been fetched on the Spark driver side at startup.
   > 
   > > For the Shuffle server, the Hadoop conf can be updated with security enabled when writing data to HDFS.
   > > What's the advantage of adding HadoopAccessorProvider?
   > 
   > Now the shuffle server can't access a secured cluster because it lacks a keytab.
   > 
   > To solve this, I introduced some configs so that the shuffle server can log in and then write HDFS files on behalf of users via a proxy user.
   
   Have you tried the following ways to update the Hadoop configuration:
   1. Option 1: in the Shuffle server, configs prefixed with `rss.server.hadoop` are added to the Hadoop configuration used for data writing.
   2. Option 2: the Coordinator can manage the information for different HDFS clusters; check the related API `rpc fetchRemoteStorage(FetchRemoteStorageRequest)` for more detail.
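   Option 1 can be sketched roughly as follows. This is a simplified illustration, not the project's actual code: a plain `Map` stands in for `org.apache.hadoop.conf.Configuration`, and stripping the `rss.server.hadoop.` prefix before applying each key is an assumption.
   
   ```java
   import java.util.HashMap;
   import java.util.Map;
   
   public class HadoopConfMerge {
     // Prefix marking shuffle-server configs forwarded to the Hadoop side.
     static final String PREFIX = "rss.server.hadoop.";
   
     // Copy every entry whose key starts with PREFIX into the Hadoop-side
     // configuration, stripping the prefix; other server configs are ignored.
     static Map<String, String> toHadoopConf(Map<String, String> serverConf) {
       Map<String, String> hadoopConf = new HashMap<>();
       for (Map.Entry<String, String> e : serverConf.entrySet()) {
         if (e.getKey().startsWith(PREFIX)) {
           hadoopConf.put(e.getKey().substring(PREFIX.length()), e.getValue());
         }
       }
       return hadoopConf;
     }
   
     public static void main(String[] args) {
       Map<String, String> serverConf = new HashMap<>();
       serverConf.put("rss.server.hadoop.dfs.replication", "2");
       serverConf.put("rss.server.buffer.capacity", "1g"); // not forwarded
       System.out.println(toHadoopConf(serverConf)); // {dfs.replication=2}
     }
   }
   ```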
   
   For the keytab problem, do you mean it can't exist on the Shuffle Server?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

