See https://builds.apache.org/job/Hadoop-Yarn-trunk-Java8/1065/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 5696 lines...]
  TestApplicationClientProtocolOnHA.initiate:54->ProtocolHATestBase.startHACluster:276 » YarnRuntime
  TestApplicationClientProtocolOnHA>ProtocolHATestBase.teardown:192 » NullPointer
  TestApplicationClientProtocolOnHA.initiate:54->ProtocolHATestBase.startHACluster:276 » YarnRuntime
  TestApplicationClientProtocolOnHA>ProtocolHATestBase.teardown:192 » NullPointer
  TestApplicationClientProtocolOnHA.initiate:54->ProtocolHATestBase.startHACluster:276 » YarnRuntime
  TestApplicationClientProtocolOnHA>ProtocolHATestBase.teardown:192 » NullPointer
  TestApplicationClientProtocolOnHA.initiate:54->ProtocolHATestBase.startHACluster:276 » YarnRuntime
  TestApplicationClientProtocolOnHA>ProtocolHATestBase.teardown:192 » NullPointer

Tests run: 187, Failures: 0, Errors: 10, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop YARN ................................ SUCCESS [ 3.277 s]
[INFO] Apache Hadoop YARN API ............................ SUCCESS [01:22 min]
[INFO] Apache Hadoop YARN Common ......................... SUCCESS [02:26 min]
[INFO] Apache Hadoop YARN Server ......................... SUCCESS [ 0.060 s]
[INFO] Apache Hadoop YARN Server Common .................. SUCCESS [ 37.812 s]
[INFO] Apache Hadoop YARN NodeManager .................... SUCCESS [09:48 min]
[INFO] Apache Hadoop YARN Web Proxy ...................... SUCCESS [ 18.063 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ...... SUCCESS [03:21 min]
[INFO] Apache Hadoop YARN ResourceManager ................ SUCCESS [ 01:08 h]
[INFO] Apache Hadoop YARN Server Tests ................... SUCCESS [02:26 min]
[INFO] Apache Hadoop YARN Client ......................... FAILURE [08:11 min]
[INFO] Apache Hadoop YARN SharedCacheManager ............. SKIPPED
[INFO] Apache Hadoop YARN Timeline Plugin Storage ........ SKIPPED
[INFO] Apache Hadoop YARN Applications ................... SKIPPED
[INFO] Apache Hadoop YARN DistributedShell ............... SKIPPED
[INFO] Apache Hadoop YARN Unmanaged Am Launcher .......... SKIPPED
[INFO] Apache Hadoop YARN Site ........................... SKIPPED
[INFO] Apache Hadoop YARN Registry ....................... SKIPPED
[INFO] Apache Hadoop YARN Project ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:36 h
[INFO] Finished at: 2016-02-17T07:15:19+00:00
[INFO] Final Memory: 80M/989M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-yarn-client: There are test failures.
[ERROR]
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Yarn-trunk-Java8/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-client/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-yarn-client
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Sending e-mails to: [email protected]
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
###################################################################################
############################## FAILED TESTS (if any) ##############################
10 tests failed.

FAILED: org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.testGetDelegationTokenOnHA

Error Message:
java.lang.reflect.InvocationTargetException

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.reflect.InvocationTargetException
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at org.apache.hadoop.security.proto.SecurityProtos$GetDelegationTokenRequestProto.newBuilder(SecurityProtos.java:1208)
    at org.apache.hadoop.yarn.api.protocolrecords.impl.pb.GetDelegationTokenRequestPBImpl.<init>(GetDelegationTokenRequestPBImpl.java:40)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
    at org.apache.hadoop.yarn.factories.impl.pb.RecordFactoryPBImpl.newRecordInstance(RecordFactoryPBImpl.java:70)
    at org.apache.hadoop.yarn.util.Records.newRecord(Records.java:36)
    at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getRMDelegationToken(YarnClientImpl.java:541)
    at org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.testGetDelegationTokenOnHA(TestApplicationClientProtocolOnHA.java:194)
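Note on the failure above: YarnClientImpl.getRMDelegationToken builds its request record through the YARN record factory, and the class-loading error in the trace surfaces inside that reflective construction. A minimal sketch of the same code path, assuming only the public GetDelegationTokenRequest/Records API (the renewer value is a placeholder, not taken from the test):

    import org.apache.hadoop.yarn.api.protocolrecords.GetDelegationTokenRequest;
    import org.apache.hadoop.yarn.util.Records;

    public class DelegationTokenRequestSketch {
      // Records.newRecord delegates to RecordFactoryPBImpl, which reflectively
      // invokes the GetDelegationTokenRequestPBImpl constructor; a failure while
      // loading the generated protobuf classes comes back as the
      // YarnRuntimeException(InvocationTargetException) shown in the trace.
      static GetDelegationTokenRequest buildRequest() {
        GetDelegationTokenRequest request =
            Records.newRecord(GetDelegationTokenRequest.class);
        request.setRenewer("placeholder-renewer"); // illustrative value only
        return request;
      }
    }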
FAILED: org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.testGetApplicationsOnHA

Error Message:
could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file
    at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
    at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1743)
    at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
    at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
    at org.apache.hadoop.yarn.server.MiniYARNCluster.<init>(MiniYARNCluster.java:149)
    at org.apache.hadoop.yarn.client.ProtocolHATestBase$MiniYARNClusterForHATesting.<init>(ProtocolHATestBase.java:310)
    at org.apache.hadoop.yarn.client.ProtocolHATestBase.startHACluster(ProtocolHATestBase.java:276)
    at org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.initiate(TestApplicationClientProtocolOnHA.java:54)
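This "could not cleanup test dir" error repeats for each test in the class: FileContext.getLocalFSFileContext, called from the MiniYARNCluster constructor, finds fs.AbstractFileSystem.file.impl resolved to null, so no AbstractFileSystem is registered for the "file" scheme. On a normal classpath core-default.xml maps that key to org.apache.hadoop.fs.local.LocalFs. A hedged sketch of the lookup and the usual binding (a reference point for debugging, not a patch for the test):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileContext;
    import org.apache.hadoop.fs.UnsupportedFileSystemException;

    public class LocalFsContextSketch {
      static FileContext localContext() throws UnsupportedFileSystemException {
        Configuration conf = new Configuration();
        // core-default.xml normally supplies this binding; its resolving to null
        // in the failing runs is what produces "No AbstractFileSystem configured
        // for scheme: file".
        conf.setIfUnset("fs.AbstractFileSystem.file.impl",
            "org.apache.hadoop.fs.local.LocalFs");
        return FileContext.getLocalFSFileContext(conf);
      }
    }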
FAILED: org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.testGetApplicationsOnHA

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
    at org.apache.hadoop.yarn.client.ProtocolHATestBase.teardown(ProtocolHATestBase.java:192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
    at org.junit.rules.TestWatchman$1.evaluate(TestWatchman.java:53)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)

FAILED: org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.testGetClusterNodesOnHA

Error Message:
could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file
    at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
    at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1743)
    at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
    at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
    at org.apache.hadoop.yarn.server.MiniYARNCluster.<init>(MiniYARNCluster.java:149)
    at org.apache.hadoop.yarn.client.ProtocolHATestBase$MiniYARNClusterForHATesting.<init>(ProtocolHATestBase.java:310)
    at org.apache.hadoop.yarn.client.ProtocolHATestBase.startHACluster(ProtocolHATestBase.java:276)
    at org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.initiate(TestApplicationClientProtocolOnHA.java:54)

FAILED: org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.testGetClusterNodesOnHA

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
    at org.apache.hadoop.yarn.client.ProtocolHATestBase.teardown(ProtocolHATestBase.java:192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
    at org.junit.rules.TestWatchman$1.evaluate(TestWatchman.java:53)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
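Each of these setup failures is paired with a NullPointerException out of ProtocolHATestBase.teardown: when startHACluster throws in setup, JUnit still runs the @After method (RunAfters in the trace), which then dereferences a cluster that was never created. A sketch of the usual guard for that pattern; the field and method names here are illustrative, not the actual ProtocolHATestBase members:

    import org.apache.hadoop.yarn.server.MiniYARNCluster;
    import org.junit.After;

    public abstract class HaTeardownGuardSketch {
      protected MiniYARNCluster cluster; // stays null if setup fails before construction

      @After
      public void teardown() throws Exception {
        // Tolerate a setup that threw before the cluster existed.
        if (cluster != null) {
          cluster.stop();
        }
      }
    }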
FAILED: org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.testGetContainerReportOnHA

Error Message:
could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file
    at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
    at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1743)
    at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
    at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
    at org.apache.hadoop.yarn.server.MiniYARNCluster.<init>(MiniYARNCluster.java:149)
    at org.apache.hadoop.yarn.client.ProtocolHATestBase$MiniYARNClusterForHATesting.<init>(ProtocolHATestBase.java:310)
    at org.apache.hadoop.yarn.client.ProtocolHATestBase.startHACluster(ProtocolHATestBase.java:276)
    at org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.initiate(TestApplicationClientProtocolOnHA.java:54)

FAILED: org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.testGetContainerReportOnHA

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
    at org.apache.hadoop.yarn.client.ProtocolHATestBase.teardown(ProtocolHATestBase.java:192)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
    at org.junit.rules.TestWatchman$1.evaluate(TestWatchman.java:53)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
FAILED: org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.testGetClusterMetricsOnHA

Error Message:
could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file
    at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
    at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1743)
    at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
    at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
    at org.apache.hadoop.yarn.server.MiniYARNCluster.<init>(MiniYARNCluster.java:149)
    at org.apache.hadoop.yarn.client.ProtocolHATestBase$MiniYARNClusterForHATesting.<init>(ProtocolHATestBase.java:310)
    at org.apache.hadoop.yarn.client.ProtocolHATestBase.startHACluster(ProtocolHATestBase.java:276)
    at org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.initiate(TestApplicationClientProtocolOnHA.java:54)

FAILED: org.apache.hadoop.yarn.client.TestApplicationClientProtocolOnHA.testGetClusterMetricsOnHA

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
    at org.apache.hadoop.yarn.client.ProtocolHATestBase.teardown(ProtocolHATestBase.java:192)
    at sun.reflect.GeneratedMethodAccessor56.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
    at org.junit.rules.TestWatchman$1.evaluate(TestWatchman.java:53)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
    at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
    at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
    at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
    at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
    at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
FAILED: org.apache.hadoop.yarn.client.TestHedgingRequestRMFailoverProxyProvider.testHedgingRequestProxyProvider

Error Message:
java.io.IOException: ResourceManager failed to start. Final state is STOPPED

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.io.IOException: ResourceManager failed to start. Final state is STOPPED
    at org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:334)
    at org.apache.hadoop.yarn.server.MiniYARNCluster.access$400(MiniYARNCluster.java:100)
    at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:458)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
    at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
    at org.apache.hadoop.yarn.client.TestHedgingRequestRMFailoverProxyProvider.testHedgingRequestProxyProvider(TestHedgingRequestRMFailoverProxyProvider.java:57)
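This last failure is of a different kind: the MiniYARNCluster ResourceManager never leaves STOPPED, so the request-hedging failover path is never exercised. For reference, a hedged sketch of the client-side HA configuration such a test drives, assuming the standard YarnConfiguration keys and the RequestHedgingRMFailoverProxyProvider class in hadoop-yarn-client; the RM IDs are illustrative:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.yarn.client.RequestHedgingRMFailoverProxyProvider;
    import org.apache.hadoop.yarn.conf.YarnConfiguration;

    public class HedgingFailoverConfSketch {
      static Configuration haClientConf() {
        Configuration conf = new YarnConfiguration();
        conf.setBoolean(YarnConfiguration.RM_HA_ENABLED, true);
        // Illustrative RM IDs; a MiniYARNCluster-based test wires in its own.
        conf.set(YarnConfiguration.RM_HA_IDS, "rm0,rm1");
        // Fan each request out to all RMs and take the first successful answer.
        conf.set(YarnConfiguration.CLIENT_FAILOVER_PROXY_PROVIDER,
            RequestHedgingRMFailoverProxyProvider.class.getName());
        return conf;
      }
    }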
