[ https://issues.apache.org/jira/browse/MAPREDUCE-7434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17692277#comment-17692277 ]
ASF GitHub Bot commented on MAPREDUCE-7434:
-------------------------------------------
hadoop-yetus commented on PR #5426:
URL: https://github.com/apache/hadoop/pull/5426#issuecomment-1440413098
:confetti_ball: **+1 overall**
| Vote | Subsystem | Runtime | Logfile | Comment |
|:----:|----------:|--------:|:--------:|:-------:|
| +0 :ok: | reexec | 0m 47s | | Docker mode activated. |
|||| _ Prechecks _ |
| +1 :green_heart: | dupname | 0m 0s | | No case conflicting files found. |
| +0 :ok: | codespell | 0m 0s | | codespell was not available. |
| +0 :ok: | detsecrets | 0m 0s | | detect-secrets was not available. |
| +1 :green_heart: | @author | 0m 0s | | The patch does not contain any @author tags. |
| +1 :green_heart: | test4tests | 0m 0s | | The patch appears to include 3 new or modified test files. |
|||| _ trunk Compile Tests _ |
| +1 :green_heart: | mvninstall | 47m 9s | | trunk passed |
| +1 :green_heart: | compile | 0m 26s | | trunk passed with JDK Ubuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04 |
| +1 :green_heart: | compile | 0m 23s | | trunk passed with JDK Private Build-1.8.0_352-8u352-ga-1~20.04-b08 |
| +1 :green_heart: | checkstyle | 0m 25s | | trunk passed |
| +1 :green_heart: | mvnsite | 0m 27s | | trunk passed |
| +1 :green_heart: | javadoc | 0m 31s | | trunk passed with JDK Ubuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04 |
| +1 :green_heart: | javadoc | 0m 21s | | trunk passed with JDK Private Build-1.8.0_352-8u352-ga-1~20.04-b08 |
| +1 :green_heart: | spotbugs | 0m 49s | | trunk passed |
| +1 :green_heart: | shadedclient | 26m 2s | | branch has no errors when building and testing our client artifacts. |
|||| _ Patch Compile Tests _ |
| +1 :green_heart: | mvninstall | 0m 27s | | the patch passed |
| +1 :green_heart: | compile | 0m 20s | | the patch passed with JDK Ubuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04 |
| +1 :green_heart: | javac | 0m 20s | | the patch passed |
| +1 :green_heart: | compile | 0m 18s | | the patch passed with JDK Private Build-1.8.0_352-8u352-ga-1~20.04-b08 |
| +1 :green_heart: | javac | 0m 18s | | the patch passed |
| +1 :green_heart: | blanks | 0m 0s | | The patch has no blanks issues. |
| -0 :warning: | checkstyle | 0m 11s | [/results-checkstyle-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-shuffle.txt](https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5426/1/artifact/out/results-checkstyle-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-shuffle.txt) | hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle: The patch generated 1 new + 0 unchanged - 0 fixed = 1 total (was 0) |
| +1 :green_heart: | mvnsite | 0m 19s | | the patch passed |
| +1 :green_heart: | javadoc | 0m 17s | | the patch passed with JDK Ubuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04 |
| +1 :green_heart: | javadoc | 0m 16s | | the patch passed with JDK Private Build-1.8.0_352-8u352-ga-1~20.04-b08 |
| +1 :green_heart: | spotbugs | 0m 42s | | the patch passed |
| +1 :green_heart: | shadedclient | 27m 21s | | patch has no errors when building and testing our client artifacts. |
|||| _ Other Tests _ |
| +1 :green_heart: | unit | 0m 36s | | hadoop-mapreduce-client-shuffle in the patch passed. |
| +1 :green_heart: | asflicense | 0m 33s | | The patch does not generate ASF License warnings. |
| | | 110m 25s | | |
| Subsystem | Report/Notes |
|----------:|:-------------|
| Docker | ClientAPI=1.42 ServerAPI=1.42 base: https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5426/1/artifact/out/Dockerfile |
| GITHUB PR | https://github.com/apache/hadoop/pull/5426 |
| Optional Tests | dupname asflicense compile javac javadoc mvninstall mvnsite unit shadedclient spotbugs checkstyle codespell detsecrets |
| uname | Linux 5d4aacfc7497 4.15.0-200-generic #211-Ubuntu SMP Thu Nov 24 18:16:04 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | dev-support/bin/hadoop.sh |
| git revision | trunk / 5f8d5f2a6f9a20621e4b51a035ce4896e41f5be7 |
| Default Java | Private Build-1.8.0_352-8u352-ga-1~20.04-b08 |
| Multi-JDK versions | /usr/lib/jvm/java-11-openjdk-amd64:Ubuntu-11.0.17+8-post-Ubuntu-1ubuntu220.04 /usr/lib/jvm/java-8-openjdk-amd64:Private Build-1.8.0_352-8u352-ga-1~20.04-b08 |
| Test Results | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5426/1/testReport/ |
| Max. process+thread count | 536 (vs. ulimit of 5500) |
| modules | C: hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle U: hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-shuffle |
| Console output | https://ci-hadoop.apache.org/job/hadoop-multibranch/job/PR-5426/1/console |
| versions | git=2.25.1 maven=3.6.3 spotbugs=4.2.2 |
| Powered by | Apache Yetus 0.14.0 https://yetus.apache.org |
This message was automatically generated.
> Fix ShuffleHandler tests
> ------------------------
>
> Key: MAPREDUCE-7434
> URL: https://issues.apache.org/jira/browse/MAPREDUCE-7434
> Project: Hadoop Map/Reduce
> Issue Type: Bug
> Affects Versions: 3.4.0
> Reporter: Tamas Domok
> Assignee: Tamas Domok
> Priority: Major
> Labels: pull-request-available
>
> https://ci-hadoop.apache.org/view/Hadoop/job/hadoop-qbt-trunk-java8-linux-x86_64/1143/testReport/junit/org.apache.hadoop.mapred/TestShuffleHandler/testMapFileAccess/
> {code}
> Error Message
> Server returned HTTP response code: 500 for URL: http://127.0.0.1:13562/mapOutput?job=job_1111111111111_0001&reduce=0&map=attempt_1111111111111_0001_m_000001_0
> Stacktrace
> java.io.IOException: Server returned HTTP response code: 500 for URL: http://127.0.0.1:13562/mapOutput?job=job_1111111111111_0001&reduce=0&map=attempt_1111111111111_0001_m_000001_0
>   at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1902)
>   at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1500)
>   at org.apache.hadoop.mapred.TestShuffleHandler.testMapFileAccess(TestShuffleHandler.java:292)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
>   at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
>   at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>   at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
>   at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>   at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:299)
>   at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:293)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>   at java.lang.Thread.run(Thread.java:750)
> Standard Output
> 12:04:17.466 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleConnections with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[# of current shuffle connections])
> 12:04:17.466 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableCounterLong org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleOutputBytes with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[Shuffle output in bytes])
> 12:04:17.466 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableCounterInt org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleOutputsFailed with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[# of failed shuffle outputs])
> 12:04:17.466 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableCounterInt org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleOutputsOK with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[# of succeeeded shuffle outputs])
> 12:04:17.466 [Time-limited test] DEBUG o.a.h.m.impl.MetricsSystemImpl - ShuffleMetrics, Shuffle output metrics
> 12:04:17.467 [Time-limited test] DEBUG o.a.hadoop.service.AbstractService - Service: mapreduce_shuffle entered state INITED
> 12:04:17.477 [Time-limited test] DEBUG o.a.hadoop.service.AbstractService - Config has been overridden during init
> 12:04:17.478 [Time-limited test] INFO org.apache.hadoop.mapred.IndexCache - IndexCache created with max memory = 10485760
> 12:04:17.479 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleConnections with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[# of current shuffle connections])
> 12:04:17.479 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableCounterLong org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleOutputBytes with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[Shuffle output in bytes])
> 12:04:17.479 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableCounterInt org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleOutputsFailed with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[# of failed shuffle outputs])
> 12:04:17.479 [Time-limited test] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableCounterInt org.apache.hadoop.mapred.ShuffleHandler$ShuffleMetrics.shuffleOutputsOK with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[# of succeeeded shuffle outputs])
> 12:04:17.479 [Time-limited test] DEBUG o.a.h.m.impl.MetricsSystemImpl - ShuffleMetrics, Shuffle output metrics
> 12:04:17.482 [Time-limited test] INFO o.a.hadoop.mapred.ShuffleHandler - mapreduce_shuffle listening on port 13562
> 12:04:17.482 [Time-limited test] DEBUG o.a.hadoop.service.AbstractService - Service mapreduce_shuffle is started
> 12:04:17.483 [Time-limited test] INFO o.a.hadoop.mapred.ShuffleHandler - Added token for job_1111111111111_0001
> 12:04:17.486 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - ShuffleChannelInitializer init; channel='a9df992f'
> 12:04:17.487 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Executing channelActive; channel='a9df992f'
> 12:04:17.487 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Added channel: [id: 0xa9df992f, L:/127.0.0.1:13562 - R:/127.0.0.1:53014], channel id: a9df992f. Accepted number of connections=1
> 12:04:17.489 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Received HTTP request: HttpObjectAggregator$AggregatedFullHttpRequest(decodeResult: success, version: HTTP/1.1, content: CompositeByteBuf(ridx: 0, widx: 0, cap: 0, components=0))
> GET /mapOutput?job=job_1111111111111_0001&reduce=0&map=attempt_1111111111111_0001_m_000001_0 HTTP/1.1
> name: mapreduce
> version: 1.0.0
> UrlHash: +OVYyZ3+ni316LsqZvZcFqgV/H8=
> User-Agent: Java/1.8.0_352
> Host: 127.0.0.1:13562
> Accept: text/html, image/gif, image/jpeg, *; q=.2, */*; q=.2
> Connection: keep-alive
> content-length: 0, channel='a9df992f'
> 12:04:17.489 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Received from request header: ShuffleVersion=1.0.0 header name=mapreduce, channel id: a9df992f
> 12:04:17.489 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - RECV: /mapOutput?job=job_1111111111111_0001&reduce=0&map=attempt_1111111111111_0001_m_000001_0
>   mapId: [attempt_1111111111111_0001_m_000001_0]
>   reduceId: [0]
>   jobId: [job_1111111111111_0001]
>   keepAlive: false
>   channel id: a9df992f
> 12:04:17.489 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Verifying request. encryptedURL:13562/mapOutput?job=job_1111111111111_0001&reduce=0&map=attempt_1111111111111_0001_m_000001_0, hash:sqZvZcFqgV/H8, channel id: a9df992f
> 12:04:17.490 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Fetcher request verified. encryptedURL: 13562/mapOutput?job=job_1111111111111_0001&reduce=0&map=attempt_1111111111111_0001_m_000001_0, reply: ii/HLwQUsdwdA, channel id: a9df992f
> 12:04:17.491 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Retrieved pathInfo for AttemptPathIdentifier{attemptId='attempt_1111111111111_0001_m_000001_0', jobId='job_1111111111111_0001'} check for corresponding loaded messages to determine whether it was loaded or cached
> 12:04:17.491 [ShuffleHandler Netty Worker #0] DEBUG org.apache.hadoop.mapred.IndexCache - IndexCache MISS: MapId attempt_1111111111111_0001_m_000001_0 not found
> 12:04:17.492 [ShuffleHandler Netty Worker #0] DEBUG o.a.h.f.s.i.IOStatisticsContextIntegration - Created instance IOStatisticsContextImpl{id=5, threadId=136, ioStatistics=counters=();
> gauges=();
> minimums=();
> maximums=();
> means=();
> }
> 12:04:17.493 [ShuffleHandler Netty Worker #0] DEBUG o.apache.hadoop.io.nativeio.NativeIO - Got UserName jenkins for ID 910 from the native implementation
> 12:04:17.494 [ShuffleHandler Netty Worker #0] DEBUG o.apache.hadoop.io.nativeio.NativeIO - Got GroupName jenkins for ID 910 from the native implementation
> 12:04:17.498 [ShuffleHandler Netty Worker #0] ERROR o.a.hadoop.mapred.ShuffleHandler - Shuffle error while populating headers. Channel id: a9df992f
> java.io.IOException: Error Reading IndexFile
>   at org.apache.hadoop.mapred.IndexCache.readIndexFileToCache(IndexCache.java:123) ~[hadoop-mapreduce-client-core-3.4.0-SNAPSHOT.jar:na]
>   at org.apache.hadoop.mapred.IndexCache.getIndexInformation(IndexCache.java:68) ~[hadoop-mapreduce-client-core-3.4.0-SNAPSHOT.jar:na]
>   at org.apache.hadoop.mapred.ShuffleChannelHandler.getMapOutputInfo(ShuffleChannelHandler.java:360) [classes/:na]
>   at org.apache.hadoop.mapred.ShuffleChannelHandler.populateHeaders(ShuffleChannelHandler.java:381) [classes/:na]
>   at org.apache.hadoop.mapred.ShuffleChannelHandler.channelRead0(ShuffleChannelHandler.java:275) [classes/:na]
>   at org.apache.hadoop.mapred.ShuffleChannelHandler.channelRead0(ShuffleChannelHandler.java:130) [classes/:na]
>   at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327) [netty-codec-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:299) [netty-codec-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496) [netty-transport-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:995) [netty-common-4.1.77.Final.jar:4.1.77.Final]
>   at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.77.Final.jar:4.1.77.Final]
>   at java.lang.Thread.run(Thread.java:750) [na:1.8.0_352]
> Caused by: java.io.IOException: Owner 'jenkins' for path /tmp/test-shuffle-channel-handler1169959588626711767/job_1111111111111_0001/testUser/attempt_1111111111111_0001_m_000001_0/index did not match expected owner 'testUser'
>   at org.apache.hadoop.io.SecureIOUtils.checkStat(SecureIOUtils.java:299) ~[hadoop-common-3.4.0-SNAPSHOT.jar:na]
>   at org.apache.hadoop.io.SecureIOUtils.forceSecureOpenFSDataInputStream(SecureIOUtils.java:183) ~[hadoop-common-3.4.0-SNAPSHOT.jar:na]
>   at org.apache.hadoop.io.SecureIOUtils.openFSDataInputStream(SecureIOUtils.java:161) ~[hadoop-common-3.4.0-SNAPSHOT.jar:na]
>   at org.apache.hadoop.mapred.SpillRecord.<init>(SpillRecord.java:71) ~[hadoop-mapreduce-client-core-3.4.0-SNAPSHOT.jar:na]
>   at org.apache.hadoop.mapred.SpillRecord.<init>(SpillRecord.java:62) ~[hadoop-mapreduce-client-core-3.4.0-SNAPSHOT.jar:na]
>   at org.apache.hadoop.mapred.IndexCache.readIndexFileToCache(IndexCache.java:119) ~[hadoop-mapreduce-client-core-3.4.0-SNAPSHOT.jar:na]
>   ... 32 common frames omitted
> 12:04:17.502 [Time-limited test] DEBUG o.a.hadoop.service.AbstractService - Service: mapreduce_shuffle entered state STOPPED
> 12:04:17.503 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - Executing channelInactive; channel='a9df992f'
> 12:04:17.503 [ShuffleHandler Netty Worker #0] DEBUG o.a.hadoop.mapred.ShuffleHandler - New value of Accepted number of connections=0
> {code}
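
For context when reading the failure above: the HTTP 500 in the test is a consequence of the "Caused by" near the end of the log, where SecureIOUtils.checkStat rejects the index file because it is owned by the CI user ('jenkins') while the test asked the shuffle handler to serve it for the hard-coded 'testUser'. The snippet below is only a minimal sketch of that mismatch and of deriving the expected owner from the running process instead of hard-coding it; it is not the change made in PR #5426, and the class name OwnerMismatchIllustration is made up for the example.

{code:java}
import java.io.IOException;

import org.apache.hadoop.security.UserGroupInformation;

// Hypothetical illustration class, not part of the Hadoop code base.
public class OwnerMismatchIllustration {
  public static void main(String[] args) throws IOException {
    // The failing test requests map output for a hard-coded user name...
    String expectedOwner = "testUser";

    // ...but the index/map-output files on disk are created by whatever user
    // runs the JVM (on the CI host: "jenkins"). SecureIOUtils compares the
    // file owner against the requested user, hence the exception
    // "Owner 'jenkins' ... did not match expected owner 'testUser'".
    String processOwner = UserGroupInformation.getCurrentUser().getShortUserName();

    if (!processOwner.equals(expectedOwner)) {
      System.out.println("Owner check would fail: files owned by '" + processOwner
          + "', request made for '" + expectedOwner + "'");
    }
  }
}
{code}

A test that creates the files itself and derives the expected owner from the current process (or otherwise keeps the two in sync) avoids this environment-dependent failure.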