[ https://issues.apache.org/jira/browse/MAPREDUCE-7434?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated MAPREDUCE-7434:
--------------------------------------
    Labels: pull-request-available  (was: )

> Fix testFailure TestShuffleHandler.testMapFileAccess
> ----------------------------------------------------
>
>                 Key: MAPREDUCE-7434
>                 URL: https://issues.apache.org/jira/browse/MAPREDUCE-7434
>             Project: Hadoop Map/Reduce
>          Issue Type: Bug
>    Affects Versions: 3.4.0
>            Reporter: Tamas Domok
>            Assignee: Tamas Domok
>            Priority: Major
>              Labels: pull-request-available
>
> https://ci-hadoop.apache.org/view/Hadoop/job/hadoop-qbt-trunk-java8-linux-x86_64/1143/testReport/junit/org.apache.hadoop.mapred/TestShuffleHandler/testMapFileAccess/
> {code}
> Error Message
> Server returned HTTP response code: 500 for URL: http://127.0.0.1:13562/mapOutput?job=job_1111111111111_0001&reduce=0&map=attempt_1111111111111_0001_m_000001_0
> Stacktrace
> java.io.IOException: Server returned HTTP response code: 500 for URL: http://127.0.0.1:13562/mapOutput?job=job_1111111111111_0001&reduce=0&map=attempt_1111111111111_0001_m_000001_0
>       at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1902)
>       at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1500)
>       at org.apache.hadoop.mapred.TestShuffleHandler.testMapFileAccess(TestShuffleHandler.java:292)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:498)
>       at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
>       at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>       at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
>       at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>       at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:299)
>       at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:293)
>       at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>       at java.lang.Thread.run(Thread.java:750)
> Standard Output
> 12:04:17.466 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleConnections with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[# of current shuffle connections])
> 12:04:17.466 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableCounterLong org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleOutputBytes with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[Shuffle output in bytes])
> 12:04:17.466 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableCounterInt org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleOutputsFailed with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[# of failed shuffle outputs])
> 12:04:17.466 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableCounterInt org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleOutputsOK with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[# of succeeeded shuffle outputs])
> 12:04:17.466 [Time-limited test] DEBUG o.a.h.m.impl.MetricsSystemImpl - ShuffleMetrics, Shuffle output metrics
> 12:04:17.467 [Time-limited test] DEBUG o.a.hadoop.service.AbstractService - Service: mapreduce_shuffle entered state INITED
> 12:04:17.477 [Time-limited test] DEBUG o.a.hadoop.service.AbstractService - Config has been overridden during init
> 12:04:17.478 [Time-limited test] INFO  org.apache.hadoop.mapred.IndexCache - IndexCache created with max memory = 10485760
> 12:04:17.479 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleConnections with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[# of current shuffle connections])
> 12:04:17.479 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableCounterLong org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleOutputBytes with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[Shuffle output in bytes])
> 12:04:17.479 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableCounterInt org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleOutputsFailed with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[# of failed shuffle outputs])
> 12:04:17.479 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableCounterInt org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleOutputsOK with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[# of succeeeded shuffle outputs])
> 12:04:17.479 [Time-limited test] DEBUG o.a.h.m.impl.MetricsSystemImpl - ShuffleMetrics, Shuffle output metrics
> 12:04:17.482 [Time-limited test] INFO  o.a.hadoop.mapred.ShuffleHandler - mapreduce_shuffle listening on port 13562
> 12:04:17.482 [Time-limited test] DEBUG o.a.hadoop.service.AbstractService - Service mapreduce_shuffle is started
> 12:04:17.483 [Time-limited test] INFO  o.a.hadoop.mapred.ShuffleHandler - Added token for job_1111111111111_0001
> 12:04:17.486 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - ShuffleChannelInitializer init; channel='a9df992f'
> 12:04:17.487 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Executing channelActive; channel='a9df992f'
> 12:04:17.487 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Added channel: [id: 0xa9df992f, L:/127.0.0.1:13562 - R:/127.0.0.1:53014], channel id: a9df992f. Accepted number of connections=1
> 12:04:17.489 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Received HTTP request: HttpObjectAggregator$AggregatedFullHttpRequest(decodeResult: success, version: HTTP/1.1, content: CompositeByteBuf(ridx: 0, widx: 0, cap: 0, components=0))
> GET /mapOutput?job=job_1111111111111_0001&reduce=0&map=attempt_1111111111111_0001_m_000001_0 HTTP/1.1
> name: mapreduce
> version: 1.0.0
> UrlHash: +OVYyZ3+ni316LsqZvZcFqgV/H8=
> User-Agent: Java/1.8.0_352
> Host: 127.0.0.1:13562
> Accept: text/html, image/gif, image/jpeg, *; q=.2, */*; q=.2
> Connection: keep-alive
> content-length: 0, channel='a9df992f'
> 12:04:17.489 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Received from request header: ShuffleVersion=1.0.0 header name=mapreduce, channel id: a9df992f
> 12:04:17.489 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - RECV: /mapOutput?job=job_1111111111111_0001&reduce=0&map=attempt_1111111111111_0001_m_000001_0
>   mapId: [attempt_1111111111111_0001_m_000001_0]
>   reduceId: [0]
>   jobId: [job_1111111111111_0001]
>   keepAlive: false
>   channel id: a9df992f
> 12:04:17.489 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Verifying request. encryptedURL:13562/mapOutput?job=job_1111111111111_0001&reduce=0&map=attempt_1111111111111_0001_m_000001_0, hash:sqZvZcFqgV/H8, channel id: a9df992f
> 12:04:17.490 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Fetcher request verified. encryptedURL: 13562/mapOutput?job=job_1111111111111_0001&reduce=0&map=attempt_1111111111111_0001_m_000001_0, reply: ii/HLwQUsdwdA, channel id: a9df992f
> 12:04:17.491 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Retrieved pathInfo for AttemptPathIdentifier{attemptId='attempt_1111111111111_0001_m_000001_0', jobId='job_1111111111111_0001'} check for corresponding loaded messages to determine whether it was loaded or cached
> 12:04:17.491 [ShuffleHandler Netty Worker #0] DEBUG org.apache.hadoop.mapred.IndexCache - IndexCache MISS: MapId attempt_1111111111111_0001_m_000001_0 not found
> 12:04:17.492 [ShuffleHandler Netty Worker #0] DEBUG o.a.h.f.s.i.IOStatisticsContextIntegration - Created instance IOStatisticsContextImpl{id=5, threadId=136, ioStatistics=counters=();
> gauges=();
> minimums=();
> maximums=();
> means=();
> }
> 12:04:17.493 [ShuffleHandler Netty Worker #0] DEBUG o.apache.hadoop.io.nativeio.NativeIO - Got UserName jenkins for ID 910 from the native implementation
> 12:04:17.494 [ShuffleHandler Netty Worker #0] DEBUG o.apache.hadoop.io.nativeio.NativeIO - Got GroupName jenkins for ID 910 from the native implementation
> 12:04:17.498 [ShuffleHandler Netty Worker #0] ERROR o.a.hadoop.mapred.ShuffleHandler - Shuffle error while populating headers. Channel id: a9df992f
> java.io.IOException: Error Reading IndexFile
>       at org.apache.hadoop.mapred.IndexCache.readIndexFileToCache(IndexCache.java:123) ~[hadoop-mapreduce-client-core-3.4.0-SNAPSHOT.jar:na]
>       at org.apache.hadoop.mapred.IndexCache.getIndexInformation(IndexCache.java:68) ~[hadoop-mapreduce-client-core-3.4.0-SNAPSHOT.jar:na]
>       at org.apache.hadoop.mapred.ShuffleChannelHandler.getMapOutputInfo(ShuffleChannelHandler.java:360) [classes/:na]
>       at org.apache.hadoop.mapred.ShuffleChannelHandler.populateHeaders(ShuffleChannelHandler.java:381) [classes/:na]
>       at org.apache.hadoop.mapred.ShuffleChannelHandler.channelRead0(ShuffleChannelHandler.java:275) [classes/:na]
>       at org.apache.hadoop.mapred.ShuffleChannelHandler.channelRead0(ShuffleChannelHandler.java:130) [classes/:na]
>       at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327) [netty-codec-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:299) [netty-codec-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:995) [netty-common-4.1.77.Final.jar:4.1.77.Final]
>       at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.77.Final.jar:4.1.77.Final]
>       at java.lang.Thread.run(Thread.java:750) [na:1.8.0_352]
> Caused by: java.io.IOException: Owner 'jenkins' for path /tmp/test-shuffle-channel-handler1169959588626711767/job_1111111111111_0001/testUser/attempt_1111111111111_0001_m_000001_0/index did not match expected owner 'testUser'
>       at org.apache.hadoop.io.SecureIOUtils.checkStat(SecureIOUtils.java:299) ~[hadoop-common-3.4.0-SNAPSHOT.jar:na]
>       at org.apache.hadoop.io.SecureIOUtils.forceSecureOpenFSDataInputStream(SecureIOUtils.java:183) ~[hadoop-common-3.4.0-SNAPSHOT.jar:na]
>       at org.apache.hadoop.io.SecureIOUtils.openFSDataInputStream(SecureIOUtils.java:161) ~[hadoop-common-3.4.0-SNAPSHOT.jar:na]
>       at org.apache.hadoop.mapred.SpillRecord.<init>(SpillRecord.java:71) ~[hadoop-mapreduce-client-core-3.4.0-SNAPSHOT.jar:na]
>       at org.apache.hadoop.mapred.SpillRecord.<init>(SpillRecord.java:62) ~[hadoop-mapreduce-client-core-3.4.0-SNAPSHOT.jar:na]
>       at org.apache.hadoop.mapred.IndexCache.readIndexFileToCache(IndexCache.java:119) ~[hadoop-mapreduce-client-core-3.4.0-SNAPSHOT.jar:na]
>       ... 32 common frames omitted
> 12:04:17.502 [Time-limited test] DEBUG o.a.hadoop.service.AbstractService - Service: mapreduce_shuffle entered state STOPPED
> 12:04:17.503 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Executing channelInactive; channel='a9df992f'
> 12:04:17.503 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - New value of Accepted number of connections=0
> {code}
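The root cause in the log is the ownership check reported by SecureIOUtils.checkStat: the index file under /tmp was created by the CI user (jenkins), while the shuffle handler serves the request on behalf of testUser and therefore expects testUser to own the file. A minimal Java sketch of that kind of owner comparison is below. It is illustrative only: the class and method names are made up for this example, and it uses `java.nio.file.Files.getOwner` rather than Hadoop's actual stat-based implementation.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class OwnerCheck {
    // Illustrative sketch: reject a file whose owner is not the expected user,
    // analogous to the check SecureIOUtils.checkStat performs before serving
    // a map output index file. Not the actual Hadoop code.
    static void checkOwner(Path file, String expectedOwner) throws IOException {
        String actualOwner = Files.getOwner(file).getName();
        if (!actualOwner.equals(expectedOwner)) {
            throw new IOException("Owner '" + actualOwner + "' for path " + file
                    + " did not match expected owner '" + expectedOwner + "'");
        }
    }

    public static void main(String[] args) throws IOException {
        Path index = Files.createTempFile("owner-check", ".index");
        String creator = Files.getOwner(index).getName();

        // Matches: the process that created the file is its owner.
        checkOwner(index, creator);
        System.out.println("check passed for owner " + creator);

        // Mirrors the test failure: the handler expects a different user
        // ('testUser') than the one that actually wrote the file ('jenkins').
        try {
            checkOwner(index, creator + "-someone-else");
        } catch (IOException expected) {
            System.out.println("rejected: " + expected.getMessage());
        } finally {
            Files.delete(index);
        }
    }
}
```

Under this reading, the fix is for the test to create the index file as (or chown it to) the user the handler will verify against, rather than relying on the CI account name.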



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: mapreduce-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: mapreduce-issues-h...@hadoop.apache.org
