Andywli commented on issue #1491:
URL: https://github.com/apache/incubator-linkis/issues/1491#issuecomment-1041202557


   In Linkis, the Hadoop FileSystem is obtained through the HDFSUtils class, so the Kerberos logic is placed in that class, where users can inspect it. The currently supported login modes are as follows:
   
   ```scala
   if (KERBEROS_ENABLE.getValue) {
     // Kerberos mode: log in from the user's keytab file under the configured directory
     val path = new File(TAB_FILE.getValue, userName + ".keytab").getPath
     val user = getKerberosUser(userName)
     UserGroupInformation.setConfiguration(getConfiguration(userName))
     UserGroupInformation.loginUserFromKeytabAndReturnUGI(user, path)
   } else {
     // Non-Kerberos mode: create a simple remote user without credentials
     UserGroupInformation.createRemoteUser(userName)
   }
   ```
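
   The branch above can be sketched as a pure decision function, which makes the two login modes easy to see at a glance. This is an illustrative sketch only: `LoginMode`, `KerberosLogin`, and `RemoteLogin` are hypothetical names introduced here, not part of the Linkis API.

   ```scala
   import java.io.File

   // Illustrative model of the two login modes; these types are not Linkis classes.
   sealed trait LoginMode
   case class KerberosLogin(principal: String, keytabPath: String) extends LoginMode
   case class RemoteLogin(userName: String) extends LoginMode

   // Mirrors the HDFSUtils branch: Kerberos enabled -> keytab login, else remote user.
   def chooseLoginMode(kerberosEnabled: Boolean, keytabDir: String, userName: String): LoginMode =
     if (kerberosEnabled)
       KerberosLogin(userName, new File(keytabDir, userName + ".keytab").getPath)
     else
       RemoteLogin(userName)
   ```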
   Users only need to configure the following parameters in the linkis.properties configuration file:
   
   ```properties
   wds.linkis.keytab.enable=true
   wds.linkis.keytab.file=/appcom/keytab/ # keytab directory; it holds one username.keytab file per user
   wds.linkis.keytab.host.enabled=false # whether to include the client host in the Kerberos principal
   wds.linkis.keytab.host=127.0.0.1 # client IP to include in the principal when host.enabled is true
   ```
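
   To illustrate how the two host-related properties above interact, here is a hedged sketch of how a principal string could be assembled. The `user/host` form follows the standard Kerberos primary/instance convention; the exact string Linkis builds may differ, and `principalFor` is a hypothetical helper, not a Linkis function.

   ```scala
   // Sketch: combine the username with the configured host into a Kerberos
   // principal, following the conventional "primary/instance" form.
   def principalFor(userName: String, hostEnabled: Boolean, host: String): String =
     if (hostEnabled) s"$userName/$host" else userName
   ```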
   If a user needs to run tasks, a Linux user with the corresponding username must be created on the server. In the standard deployment, where users execute Spark and Hive tasks, a directory for the corresponding user must also be created in the local workspace and under the HDFS directory /tmp/linkis.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


