lvchongyi opened a new issue #5872:
URL: https://github.com/apache/dolphinscheduler/issues/5872


   DolphinScheduler version:
   1.3.6
   
   common.properties config:
   hadoop.security.authentication.startup.state=true
   java.security.krb5.conf.path=/etc/krb5.conf
   [email protected]
   login.user.keytab.path=/etc/security/keytabs/dolphinscheduler.keytab
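
   Before digging into Kerberos itself, it can help to confirm that the files these properties point to actually exist and are readable on every worker host — a missing or unreadable keytab on one worker is a common cause of this class of failure. A minimal sanity-check sketch (the `check_kerberos_config` helper is hypothetical, not part of DolphinScheduler; the property keys are the ones from the snippet above, and the parser is a simplified one that ignores Java `.properties` escape rules):

   ```python
   # Hypothetical sanity check: verify that the files referenced by the
   # Kerberos keys in common.properties exist on this host.
   import os

   KERBEROS_FILE_KEYS = ("java.security.krb5.conf.path", "login.user.keytab.path")

   def load_properties(path):
       """Parse a simple Java .properties file into a dict (no escape handling)."""
       props = {}
       with open(path) as f:
           for line in f:
               line = line.strip()
               if not line or line.startswith(("#", "!")):
                   continue
               if "=" in line:
                   key, _, value = line.partition("=")
                   props[key.strip()] = value.strip()
       return props

   def check_kerberos_config(path):
       """Return a list of problems found in the Kerberos-related settings."""
       props = load_properties(path)
       problems = []
       if props.get("hadoop.security.authentication.startup.state") != "true":
           problems.append("kerberos startup state is not enabled")
       for key in KERBEROS_FILE_KEYS:
           value = props.get(key)
           if not value:
               problems.append("%s is not set" % key)
           elif not os.path.isfile(value):
               problems.append("%s points to a missing file: %s" % (key, value))
       return problems
   ```

   Running this against common.properties on each worker would quickly show whether the keytab or krb5.conf path is broken on the host that executed the task.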
   
   Details:
   My MapReduce task fails with a Kerberos authentication error even though the settings above are configured. Why?
   
   Error log:
   [INFO] 2021-07-21 16:11:35.930  - [taskAppId=TASK-4-14-25]:[115] - create dir success /usr/lib/dolphinscheduler/data/exec/process/1/4/14/25
   [INFO] 2021-07-21 16:11:38.903  - [taskAppId=TASK-4-14-25]:[75] - mapreduce task params {"mainArgs":"2 4","programType":"JAVA","mainClass":"pi","appName":"PI","mainJar":{"id":4},"localParams":[],"others":"","resourceList":[]}
   [INFO] 2021-07-21 16:11:40.117  - [taskAppId=TASK-4-14-25]:[119] - mapreduce task command: hadoop jar mapreduce/hadoop-mapreduce-examples-3.2.1.jar pi -Dmapreduce.job.name=PI -Dmapreduce.job.queuename=default 2 4
   [INFO] 2021-07-21 16:11:40.118  - [taskAppId=TASK-4-14-25]:[87] - tenantCode user:newland, task dir:4_14_25
   [INFO] 2021-07-21 16:11:40.119  - [taskAppId=TASK-4-14-25]:[92] - create command file:/usr/lib/dolphinscheduler/data/exec/process/1/4/14/25/4_14_25.command
   [INFO] 2021-07-21 16:11:40.120  - [taskAppId=TASK-4-14-25]:[111] - command : #!/bin/sh
   BASEDIR=$(cd `dirname $0`; pwd)
   cd $BASEDIR
   source /usr/lib/dolphinscheduler/conf/env/dolphinscheduler_env.sh
   hadoop jar mapreduce/hadoop-mapreduce-examples-3.2.1.jar pi -Dmapreduce.job.name=PI -Dmapreduce.job.queuename=default 2 4
   [INFO] 2021-07-21 16:11:40.130  - [taskAppId=TASK-4-14-25]:[327] - task run command: sudo -u newland sh /usr/lib/dolphinscheduler/data/exec/process/1/4/14/25/4_14_25.command
   [INFO] 2021-07-21 16:11:40.141  - [taskAppId=TASK-4-14-25]:[208] - process start, process id is: 19743
   [INFO] 2021-07-21 16:11:41.165  - [taskAppId=TASK-4-14-25]:[129] -  -> SLF4J: Class path contains multiple SLF4J bindings.
        SLF4J: Found binding in [jar:file:/usr/lib/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
        SLF4J: Found binding in [jar:file:/usr/lib/tez/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
        SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
        SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
   [INFO] 2021-07-21 16:11:42.166  - [taskAppId=TASK-4-14-25]:[129] -  -> Number of Maps  = 2
        Samples per Map = 4
   [INFO] 2021-07-21 16:11:43.994  - [taskAppId=TASK-4-14-25]:[217] - process has exited, execute path:/usr/lib/dolphinscheduler/data/exec/process/1/4/14/25, processId:19743 ,exitStatusCode:0
   [INFO] 2021-07-21 16:11:44.169  - [taskAppId=TASK-4-14-25]:[129] -  -> 21/07/21 16:11:43 WARN ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
        java.io.IOException: DestHost:destPort vdapp117:8020 , LocalHost:localPort vdapp118/172.32.148.87:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
                at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
                at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
                at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
                at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
                at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:833)
                at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:808)
                at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1549)
                at org.apache.hadoop.ipc.Client.call(Client.java:1491)
                at org.apache.hadoop.ipc.Client.call(Client.java:1388)
                at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
                at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
                at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
                at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:907)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.lang.reflect.Method.invoke(Method.java:498)
                at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
                at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
                at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
                at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
                at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
                at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
                at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1666)
                at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1576)
                at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1573)
                at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
                at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1588)
                at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1683)
                at org.apache.hadoop.examples.QuasiMonteCarlo.estimatePi(QuasiMonteCarlo.java:279)
                at org.apache.hadoop.examples.QuasiMonteCarlo.run(QuasiMonteCarlo.java:360)
                at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
                at org.apache.hadoop.examples.QuasiMonteCarlo.main(QuasiMonteCarlo.java:368)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.lang.reflect.Method.invoke(Method.java:498)
                at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
                at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
                at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
                at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
                at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
                at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
                at java.lang.reflect.Method.invoke(Method.java:498)
                at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
                at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
        Caused by: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
                at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:770)
                at java.security.AccessController.doPrivileged(Native Method)
                at javax.security.auth.Subject.doAs(Subject.java:422)
                at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
                at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:733)
                at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:827)
                at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:421)
                at org.apache.hadoop.ipc.Client.getConnection(Client.java:1606)
                at org.apache.hadoop.ipc.Client.call(Client.java:1435)
                ... 38 more
        Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
                at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:173)
                at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:390)
                at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:627)
                at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:421)
                at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:814)
                at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:810)
                at java.security.AccessController.doPrivileged(Native Method)
                at javax.security.auth.Subject.doAs(Subject.java:422)
                at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
                at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:810)
                ... 41 more
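
   For context on the trace above: `Client cannot authenticate via:[TOKEN, KERBEROS]` generally means the HDFS client held neither a delegation token nor a valid Kerberos ticket when it contacted the NameNode (vdapp117:8020). Note that the command file is executed as `sudo -u newland sh ...`, so the `hadoop jar` process runs as the tenant user `newland`, not as the user that owns the worker and its keytab; if `newland` has no ticket cache of its own, the job fails in exactly this way. A manual check worth trying on the worker host (these are standard Kerberos/Hadoop commands; the keytab path is the one from common.properties, while `<principal>` is a placeholder for the masked `login.user.keytab.username` value):

   ```shell
   # Run on the worker host as the tenant user (e.g. after: su - newland).
   # Replace <principal> with the actual login.user.keytab.username value.
   kinit -kt /etc/security/keytabs/dolphinscheduler.keytab '<principal>'
   klist            # should now show a valid TGT for the principal
   hadoop fs -ls /  # should list HDFS instead of raising AccessControlException
   ```

   If `kinit` fails here, the keytab or principal is wrong; if it succeeds but the scheduled task still fails, the ticket is likely not being obtained (or not visible) in the environment where the task command actually runs.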
   
   
   

