[ 
https://issues.apache.org/jira/browse/HDDS-1857?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xiaoyu Yao resolved HDDS-1857.
------------------------------
    Resolution: Not A Problem

> YARN fails on mapreduce in Kerberos enabled cluster
> ---------------------------------------------------
>
>                 Key: HDDS-1857
>                 URL: https://issues.apache.org/jira/browse/HDDS-1857
>             Project: Hadoop Distributed Data Store
>          Issue Type: Bug
>            Reporter: Eric Yang
>            Assignee: Xiaoyu Yao
>            Priority: Blocker
>
> When Ozone is configured as a secure cluster, running a MapReduce job on secure 
> YARN produces this error message:
> {code}
> 2019-07-23 19:33:12,168 INFO retry.RetryInvocationHandler: com.google.protobuf.ServiceException: java.io.IOException: DestHost:destPort eyang-1.openstacklocal:9862 , LocalHost:localPort eyang-1.openstacklocal/172.26.111.17:0. Failed on local exception: java.io.IOException: Couldn't set up IO streams: java.util.ServiceConfigurationError: org.apache.hadoop.security.SecurityInfo: Provider org.apache.hadoop.yarn.server.RMNMSecurityInfoClass not a subtype, while invoking $Proxy13.submitRequest over nodeId=null,nodeAddress=eyang-1.openstacklocal:9862 after 9 failover attempts. Trying to failover immediately.
> 2019-07-23 19:33:12,174 ERROR ha.OMFailoverProxyProvider: Failed to connect to OM. Attempted 10 retries and 10 failovers
> 2019-07-23 19:33:12,176 ERROR client.OzoneClientFactory: Couldn't create protocol class org.apache.hadoop.ozone.client.rpc.RpcClient exception: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>     at org.apache.hadoop.ozone.client.OzoneClientFactory.getClientProtocol(OzoneClientFactory.java:291)
>     at org.apache.hadoop.ozone.client.OzoneClientFactory.getRpcClient(OzoneClientFactory.java:169)
>     at org.apache.hadoop.fs.ozone.BasicOzoneClientAdapterImpl.<init>(BasicOzoneClientAdapterImpl.java:137)
>     at org.apache.hadoop.fs.ozone.BasicOzoneClientAdapterImpl.<init>(BasicOzoneClientAdapterImpl.java:101)
>     at org.apache.hadoop.fs.ozone.BasicOzoneClientAdapterImpl.<init>(BasicOzoneClientAdapterImpl.java:86)
>     at org.apache.hadoop.fs.ozone.OzoneClientAdapterImpl.<init>(OzoneClientAdapterImpl.java:34)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>     at org.apache.hadoop.fs.ozone.OzoneClientAdapterFactory.lambda$createAdapter$1(OzoneClientAdapterFactory.java:66)
>     at org.apache.hadoop.fs.ozone.OzoneClientAdapterFactory.createAdapter(OzoneClientAdapterFactory.java:116)
>     at org.apache.hadoop.fs.ozone.OzoneClientAdapterFactory.createAdapter(OzoneClientAdapterFactory.java:62)
>     at org.apache.hadoop.fs.ozone.OzoneFileSystem.createAdapter(OzoneFileSystem.java:98)
>     at org.apache.hadoop.fs.ozone.BasicOzoneFileSystem.initialize(BasicOzoneFileSystem.java:144)
>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3338)
>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:136)
>     at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3387)
>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3355)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:497)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:245)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:481)
>     at org.apache.hadoop.fs.Path.getFileSystem(Path.java:365)
>     at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:352)
>     at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:250)
>     at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:233)
>     at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:104)
>     at org.apache.hadoop.fs.shell.Command.run(Command.java:177)
>     at org.apache.hadoop.fs.FsShell.run(FsShell.java:327)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
>     at org.apache.hadoop.fs.FsShell.main(FsShell.java:390)
> Caused by: java.io.IOException: DestHost:destPort eyang-1.openstacklocal:9862 , LocalHost:localPort eyang-1.openstacklocal/172.26.111.17:0. Failed on local exception: java.io.IOException: Couldn't set up IO streams: java.util.ServiceConfigurationError: org.apache.hadoop.security.SecurityInfo: Provider org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.security.LocalizerSecurityInfo not a subtype
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>     at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831)
>     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:806)
>     at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1515)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1457)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1367)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
>     at com.sun.proxy.$Proxy13.submitRequest(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
>     at com.sun.proxy.$Proxy13.submitRequest(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.hadoop.hdds.tracing.TraceAllMethod.invoke(TraceAllMethod.java:66)
>     at com.sun.proxy.$Proxy13.submitRequest(Unknown Source)
>     at org.apache.hadoop.ozone.om.protocolPB.OzoneManagerProtocolClientSideTranslatorPB.submitRequest(OzoneManagerProtocolClientSideTranslatorPB.java:326)
>     at org.apache.hadoop.ozone.om.protocolPB.OzoneManagerProtocolClientSideTranslatorPB.getServiceList(OzoneManagerProtocolClientSideTranslatorPB.java:1155)
>     at org.apache.hadoop.ozone.client.rpc.RpcClient.getScmAddressForClient(RpcClient.java:234)
>     at org.apache.hadoop.ozone.client.rpc.RpcClient.<init>(RpcClient.java:156)
>     ... 36 more
> Caused by: java.io.IOException: Couldn't set up IO streams: java.util.ServiceConfigurationError: org.apache.hadoop.security.SecurityInfo: Provider org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.security.LocalizerSecurityInfo not a subtype
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:866)
>     at org.apache.hadoop.ipc.Client$Connection.access$3700(Client.java:411)
>     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1572)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1403)
>     ... 60 more
> Caused by: java.util.ServiceConfigurationError: org.apache.hadoop.security.SecurityInfo: Provider org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.security.LocalizerSecurityInfo not a subtype
>     at java.util.ServiceLoader.fail(ServiceLoader.java:239)
>     at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
>     at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
>     at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
>     at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
>     at org.apache.hadoop.security.SecurityUtil.getTokenInfo(SecurityUtil.java:399)
>     at org.apache.hadoop.security.SaslRpcClient.getServerToken(SaslRpcClient.java:267)
>     at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:219)
>     at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:160)
>     at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:390)
>     at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:617)
>     at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:411)
>     at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:804)
>     at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:800)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:800)
>     ... 63 more
> 2019-07-23 19:33:12,182 ERROR ozone.OzoneClientAdapterFactory: Can't initialize the ozoneClientAdapter
> java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>     at org.apache.hadoop.fs.ozone.OzoneClientAdapterFactory.lambda$createAdapter$1(OzoneClientAdapterFactory.java:66)
>     at org.apache.hadoop.fs.ozone.OzoneClientAdapterFactory.createAdapter(OzoneClientAdapterFactory.java:116)
>     at org.apache.hadoop.fs.ozone.OzoneClientAdapterFactory.createAdapter(OzoneClientAdapterFactory.java:62)
>     at org.apache.hadoop.fs.ozone.OzoneFileSystem.createAdapter(OzoneFileSystem.java:98)
>     at org.apache.hadoop.fs.ozone.BasicOzoneFileSystem.initialize(BasicOzoneFileSystem.java:144)
>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3338)
>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:136)
>     at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3387)
>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3355)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:497)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:245)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:481)
>     at org.apache.hadoop.fs.Path.getFileSystem(Path.java:365)
>     at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:352)
>     at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:250)
>     at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:233)
>     at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:104)
>     at org.apache.hadoop.fs.shell.Command.run(Command.java:177)
>     at org.apache.hadoop.fs.FsShell.run(FsShell.java:327)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
>     at org.apache.hadoop.fs.FsShell.main(FsShell.java:390)
> Caused by: java.io.IOException: DestHost:destPort eyang-1.openstacklocal:9862 , LocalHost:localPort eyang-1.openstacklocal/172.26.111.17:0. Failed on local exception: java.io.IOException: Couldn't set up IO streams: java.util.ServiceConfigurationError: org.apache.hadoop.security.SecurityInfo: Provider org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.security.LocalizerSecurityInfo not a subtype
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>     at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831)
>     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:806)
>     at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1515)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1457)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1367)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
>     at com.sun.proxy.$Proxy13.submitRequest(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
>     at com.sun.proxy.$Proxy13.submitRequest(Unknown Source)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.hadoop.hdds.tracing.TraceAllMethod.invoke(TraceAllMethod.java:66)
>     at com.sun.proxy.$Proxy13.submitRequest(Unknown Source)
>     at org.apache.hadoop.ozone.om.protocolPB.OzoneManagerProtocolClientSideTranslatorPB.submitRequest(OzoneManagerProtocolClientSideTranslatorPB.java:326)
>     at org.apache.hadoop.ozone.om.protocolPB.OzoneManagerProtocolClientSideTranslatorPB.getServiceList(OzoneManagerProtocolClientSideTranslatorPB.java:1155)
>     at org.apache.hadoop.ozone.client.rpc.RpcClient.getScmAddressForClient(RpcClient.java:234)
>     at org.apache.hadoop.ozone.client.rpc.RpcClient.<init>(RpcClient.java:156)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>     at org.apache.hadoop.ozone.client.OzoneClientFactory.getClientProtocol(OzoneClientFactory.java:291)
>     at org.apache.hadoop.ozone.client.OzoneClientFactory.getRpcClient(OzoneClientFactory.java:169)
>     at org.apache.hadoop.fs.ozone.BasicOzoneClientAdapterImpl.<init>(BasicOzoneClientAdapterImpl.java:137)
>     at org.apache.hadoop.fs.ozone.BasicOzoneClientAdapterImpl.<init>(BasicOzoneClientAdapterImpl.java:101)
>     at org.apache.hadoop.fs.ozone.BasicOzoneClientAdapterImpl.<init>(BasicOzoneClientAdapterImpl.java:86)
>     at org.apache.hadoop.fs.ozone.OzoneClientAdapterImpl.<init>(OzoneClientAdapterImpl.java:34)
>     ... 26 more
> Caused by: java.io.IOException: Couldn't set up IO streams: java.util.ServiceConfigurationError: org.apache.hadoop.security.SecurityInfo: Provider org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.security.LocalizerSecurityInfo not a subtype
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:866)
>     at org.apache.hadoop.ipc.Client$Connection.access$3700(Client.java:411)
>     at org.apache.hadoop.ipc.Client.getConnection(Client.java:1572)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1403)
>     ... 60 more
> Caused by: java.util.ServiceConfigurationError: org.apache.hadoop.security.SecurityInfo: Provider org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.security.LocalizerSecurityInfo not a subtype
>     at java.util.ServiceLoader.fail(ServiceLoader.java:239)
>     at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
>     at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
>     at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
>     at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
>     at org.apache.hadoop.security.SecurityUtil.getTokenInfo(SecurityUtil.java:399)
>     at org.apache.hadoop.security.SaslRpcClient.getServerToken(SaslRpcClient.java:267)
>     at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:219)
>     at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:160)
>     at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:390)
>     at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:617)
>     at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:411)
>     at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:804)
>     at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:800)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
>     at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:800)
>     ... 63 more
> {code}
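> The root "not a subtype" error in the trace comes from java.util.ServiceLoader: a 
> registered SecurityInfo provider class does not implement the SecurityInfo 
> interface that the loading classloader resolves, which typically happens when 
> duplicate copies of the same classes reach different classloaders (e.g. via a 
> shaded or bundled jar). As a hedged illustration only (class names below are 
> invented for the demo, not from Hadoop), a minimal self-contained sketch that 
> reproduces the same error shape:
> {code}
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.ServiceConfigurationError;
import java.util.ServiceLoader;

public class NotASubtypeDemo {
    /** Stand-in for org.apache.hadoop.security.SecurityInfo. */
    public interface Service {}

    /** Registered as a provider, but does NOT implement Service. */
    public static class BadProvider {}

    /**
     * Writes a ServiceLoader registration that points Service at
     * BadProvider, then iterates the loader and returns the resulting
     * error message.
     */
    public static String triggerError() throws Exception {
        // Throwaway classpath entry holding only the registration file.
        Path dir = Files.createTempDirectory("svc-demo");
        Path reg = dir.resolve("META-INF/services/" + Service.class.getName());
        Files.createDirectories(reg.getParent());
        Files.write(reg, List.of(BadProvider.class.getName()));

        ClassLoader loader = new URLClassLoader(
                new URL[] { dir.toUri().toURL() },
                NotASubtypeDemo.class.getClassLoader());
        try {
            for (Service s : ServiceLoader.load(Service.class, loader)) {
                s.hashCode(); // never reached: loading fails first
            }
            return "no error";
        } catch (ServiceConfigurationError e) {
            // Same failure shape as the trace above:
            // "<service>: Provider <class> not a subtype"
            return e.getMessage();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(triggerError());
    }
}
> {code}
> In the real cluster the registration files are correct; the mismatch arises 
> because the provider and the interface are loaded by different classloaders, 
> so the subtype check fails for the same reason as this sketch.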



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
