[ https://issues.apache.org/jira/browse/MAPREDUCE-7442?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jiandan Yang  reassigned MAPREDUCE-7442:
----------------------------------------

    Assignee:     (was: Jiandan Yang )

> exception message is not intuitive when accessing the job configuration web UI
> ------------------------------------------------------------------------------
>
>                 Key: MAPREDUCE-7442
>                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-7442
>             Project: Hadoop Map/Reduce
>          Issue Type: Bug
>          Components: applicationmaster
>         Environment: 
>            Reporter: Jiandan Yang 
>            Priority: Major
>         Attachments: image-2023-07-14-11-23-10-762.png
>
>
> I launched a Teragen job on a hadoop-3.3.4 cluster.
> The web UI returned an error when I clicked the Configuration link of the job. The 
> error page said "HTTP ERROR 500 java.lang.IllegalArgumentException: RFC6265 
> Cookie values may not contain character: [ ]", and I could not find any solution 
> starting from that error message.
> I found additional stack traces in the AM log, and they show that the yarn user 
> did not have permission on the staging directory. After granting yarn 
> permission (see the sketch below), I could access the configuration page.
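>
> For reference, a rough sketch of the workaround I applied, expressed with the HDFS ACL Java API (the equivalent of {{hdfs dfs -setfacl}} on the command line). The path and user are from my cluster and are illustrative only, and ACLs must be enabled on the NameNode ({{dfs.namenode.acls.enabled=true}}):
> {code:java}
> import java.util.Collections;
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.FileSystem;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.fs.permission.AclEntry;
> import org.apache.hadoop.fs.permission.AclEntryScope;
> import org.apache.hadoop.fs.permission.AclEntryType;
> import org.apache.hadoop.fs.permission.FsAction;
>
> public class GrantStagingAccess {
>   public static void main(String[] args) throws Exception {
>     FileSystem fs = FileSystem.get(new Configuration());
>     // Give the yarn user EXECUTE (traverse) on the staging directory;
>     // access=EXECUTE is exactly what the AccessControlException reports.
>     AclEntry entry = new AclEntry.Builder()
>         .setScope(AclEntryScope.ACCESS)
>         .setType(AclEntryType.USER)
>         .setName("yarn")
>         .setPermission(FsAction.EXECUTE)
>         .build();
>     fs.modifyAclEntries(new Path("/user/ubd_dmp_test/.staging"),
>         Collections.singletonList(entry));
>   }
> }
> {code}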
> I think the problem is that the error page does not provide a useful or 
> meaningful message. It would be better if the error page contained a hint such 
> as "yarn does not have HDFS permission"; a rough sketch of what that could 
> look like follows.
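>
> As a hedged sketch only (not a tested patch): ConfBlock#render could catch the permission failure while loading job.xml and render a readable hint. ConfInfo, Job#getConfFile and the Hamlet builder are from the existing code; AccessControlException extends IOException, so the more specific catch has to come first:
> {code:java}
> // Inside ConfBlock#render(Block html), after the Job has been looked up.
> // Assumed imports: java.io.IOException,
> // org.apache.hadoop.security.AccessControlException.
> Path confPath = job.getConfFile();
> try {
>   ConfInfo info = new ConfInfo(job); // reads job.xml; this is what fails today
>   // ... existing configuration-table rendering, unchanged ...
> } catch (AccessControlException ace) {
>   // Surface the real cause instead of an opaque HTTP 500.
>   html.p().__("Cannot load configuration " + confPath
>       + ": the AM user does not have HDFS permission on the staging directory. "
>       + ace.getMessage()).__();
> } catch (IOException e) {
>   html.p().__("Error while reading " + confPath + ": " + e.getMessage()).__();
> }
> {code}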
> The screenshot of the error page is as follows:
> !image-2023-07-14-11-23-10-762.png!
> The error logs of the AM are as follows (the relevant frames are ConfBlock.render → ConfInfo.<init> → JobImpl.loadConfFile → FileContext.open):
> {code:java}
> 2023-07-14 11:20:08,218 ERROR [qtp1379757019-43] org.apache.hadoop.yarn.webapp.View: Error while reading hdfs://dmp/user/ubd_dmp_test/.staging/job_1689296289020_0006/job.xml
> org.apache.hadoop.security.AccessControlException: Permission denied: user=yarn, access=EXECUTE, inode="/user/ubd_dmp_test/.staging":ubd_dmp_test:ubd_dmp_test:drwx------
>       at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:506)
>       at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:422)
>       at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:333)
>       at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermissionWithContext(FSPermissionChecker.java:370)
>       at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:240)
>       at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:713)
>       at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:1892)
>       at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:1910)
>       at org.apache.hadoop.hdfs.server.namenode.FSDirectory.resolvePath(FSDirectory.java:727)
>       at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getBlockLocations(FSDirStatAndListingOp.java:154)
>       at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:2089)
>       at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:762)
>       at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:458)
>       at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
>       at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:604)
>       at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:572)
>       at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:556)
>       at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1093)
>       at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1043)
>       at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:971)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:422)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
>       at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2976)
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>       at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>       at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:121)
>       at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:88)
>       at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:902)
>       at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:889)
>       at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:878)
>       at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1046)
>       at org.apache.hadoop.fs.Hdfs.open(Hdfs.java:373)
>       at org.apache.hadoop.fs.Hdfs.open(Hdfs.java:60)
>       at org.apache.hadoop.fs.AbstractFileSystem.open(AbstractFileSystem.java:670)
>       at org.apache.hadoop.fs.FileContext$6.next(FileContext.java:874)
>       at org.apache.hadoop.fs.FileContext$6.next(FileContext.java:870)
>       at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
>       at org.apache.hadoop.fs.FileContext.open(FileContext.java:876)
>       at org.apache.hadoop.mapreduce.v2.app.job.impl.JobImpl.loadConfFile(JobImpl.java:2287)
>       at org.apache.hadoop.mapreduce.v2.app.webapp.dao.ConfInfo.<init>(ConfInfo.java:45)
>       at org.apache.hadoop.mapreduce.v2.app.webapp.ConfBlock.render(ConfBlock.java:71)
>       at org.apache.hadoop.yarn.webapp.view.HtmlBlock.render(HtmlBlock.java:69)
>       at org.apache.hadoop.yarn.webapp.view.HtmlBlock.renderPartial(HtmlBlock.java:79)
>       at org.apache.hadoop.yarn.webapp.View.render(View.java:243)
>       at org.apache.hadoop.yarn.webapp.view.HtmlPage$Page.subView(HtmlPage.java:49)
>       at org.apache.hadoop.yarn.webapp.hamlet2.HamletImpl$EImp._v(HamletImpl.java:117)
>       at org.apache.hadoop.yarn.webapp.hamlet2.Hamlet$TD.__(Hamlet.java:848)
>       at org.apache.hadoop.yarn.webapp.view.TwoColumnLayout.render(TwoColumnLayout.java:71)
>       at org.apache.hadoop.yarn.webapp.view.HtmlPage.render(HtmlPage.java:82)
>       at org.apache.hadoop.yarn.webapp.Controller.render(Controller.java:216)
>       at org.apache.hadoop.mapreduce.v2.app.webapp.AppController.conf(AppController.java:324)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at org.apache.hadoop.yarn.webapp.Dispatcher.service(Dispatcher.java:171)
>       at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
>       at com.google.inject.servlet.ServletDefinition.doServiceImpl(ServletDefinition.java:287)
>       at com.google.inject.servlet.ServletDefinition.doService(ServletDefinition.java:277)
>       at com.google.inject.servlet.ServletDefinition.service(ServletDefinition.java:182)
>       at com.google.inject.servlet.ManagedServletPipeline.service(ManagedServletPipeline.java:91)
>       at com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:85)
>       at com.sun.jersey.spi.container.servlet.ServletContainer.doFilter(ServletContainer.java:941)
>       at com.sun.jersey.spi.container.servlet.ServletContainer.doFilter(ServletContainer.java:875)
>       at com.sun.jersey.spi.container.servlet.ServletContainer.doFilter(ServletContainer.java:829)
>       at com.google.inject.servlet.FilterChainInvocation.doFilter(FilterChainInvocation.java:82)
>       at com.google.inject.servlet.ManagedFilterPipeline.dispatch(ManagedFilterPipeline.java:119)
>       at com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:133)
>       at com.google.inject.servlet.GuiceFilter$1.call(GuiceFilter.java:130)
>       at com.google.inject.servlet.GuiceFilter$Context.call(GuiceFilter.java:203)
>       at com.google.inject.servlet.GuiceFilter.doFilter(GuiceFilter.java:130)
>       at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
>       at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
>       at org.apache.hadoop.security.http.XFrameOptionsFilter.doFilter(XFrameOptionsFilter.java:57)
>       at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
>       at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
>       at org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.doFilter(AmIpFilter.java:179)
>       at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
>       at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
>       at org.apache.hadoop.http.HttpServer2$QuotingInputFilter.doFilter(HttpServer2.java:1764)
>       at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
>       at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
>       at org.apache.hadoop.http.NoCacheFilter.doFilter(NoCacheFilter.java:45)
>       at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
>       at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
>       at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:548)
>       at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
>       at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:602)
>       at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
>       at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
>       at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
>       at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
>       at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1434)
>       at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
>       at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:501)
>       at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
>       at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
>       at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1349)
>       at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
>       at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
>       at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
>       at org.eclipse.jetty.server.Server.handle(Server.java:516)
>       at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:388)
>       at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:633)
>       at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:380)
>       at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
>       at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
>       at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
>       at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
>       at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
>       at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
>       at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
>       at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
>       at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:386)
>       at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
>       at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
>       at java.lang.Thread.run(Thread.java:748)
> {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

