See
<https://ci-beam.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/7284/display/redirect?page=changes>
Changes:
[noreply] Merge pull request #28053: Editorial pass on the new Transform service
------------------------------------------
[...truncated 514.68 KB...]
at org.apache.beam.runners.dataflow.****.BatchDataflowWorker.doWork(BatchDataflowWorker.java:291)
at org.apache.beam.runners.dataflow.****.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:221)
at org.apache.beam.runners.dataflow.****.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:147)
at org.apache.beam.runners.dataflow.****.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:127)
at org.apache.beam.runners.dataflow.****.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: [CIRCULAR REFERENCE:
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File
/.temp-beam-3ed1a6ac-3769-4833-9ad2-fc1e2f15c0c3/ec32bc0eaa6a0a2e-b8da-485f-abb2-4b566a26aac3
could only be written to 0 of the 1 minReplication nodes. There are 0
datanode(s) running and 0 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:2315)
at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:294)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2960)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:904)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:593)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:604)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:572)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:556)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1093)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1043)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:971)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2976)
]
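The RemoteException above is the proximate cause of this run's failure: the HDFS cluster backing the test reported zero live datanodes, so no block allocation could satisfy minReplication=1. As a hedged illustration only (the NameNode address below is a hypothetical placeholder, not taken from this job's configuration), a client can confirm datanode liveness through Hadoop's public DistributedFileSystem API:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.hdfs.DistributedFileSystem;
    import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

    public class DatanodeLivenessCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Hypothetical NameNode address; substitute the cluster under test.
            conf.set("fs.defaultFS", "hdfs://namenode:9000");
            try (FileSystem fs = FileSystem.get(conf)) {
                // getDataNodeStats() asks the NameNode for its live datanode list;
                // an empty array corresponds to the "0 datanode(s) running" report above.
                DatanodeInfo[] live = ((DistributedFileSystem) fs).getDataNodeStats();
                System.out.println("Live datanodes: " + live.length);
            }
        }
    }

An empty result here would point at the test's HDFS deployment (datanodes never registered with the NameNode) rather than at Beam's AvroIO code.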
Aug 17, 2023 11:11:00 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2023-08-17T23:10:58.533Z:
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File
/.temp-beam-3ed1a6ac-3769-4833-9ad2-fc1e2f15c0c3/dc03dee798d540ae-5980-40e6-bb34-7f09684ec807
could only be written to 0 of the 1 minReplication nodes. There are 0
datanode(s) running and 0 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:2315)
at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:294)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2960)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:904)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:593)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:604)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:572)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:556)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1093)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1043)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:971)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2976)
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1560)
at org.apache.hadoop.ipc.Client.call(Client.java:1506)
at org.apache.hadoop.ipc.Client.call(Client.java:1403)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
at com.sun.proxy.$Proxy114.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:448)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:433)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:166)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:158)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:96)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:362)
at com.sun.proxy.$Proxy115.addBlock(Unknown Source)
at org.apache.hadoop.hdfs.DataStreamer.locateFollowingBlock(DataStreamer.java:1846)
at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1645)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:710)
Suppressed: java.lang.IllegalArgumentException: Self-suppression not permitted
at java.lang.Throwable.addSuppressed(Throwable.java:1072)
at org.apache.beam.sdk.io.FileBasedSink$Writer.closeChannelAndThrow(FileBasedSink.java:1018)
at org.apache.beam.sdk.io.FileBasedSink$Writer.close(FileBasedSink.java:1047)
at org.apache.beam.sdk.io.WriteFiles.writeOrClose(WriteFiles.java:650)
at org.apache.beam.sdk.io.WriteFiles.access$1100(WriteFiles.java:123)
at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesFn.processElement(WriteFiles.java:618)
at org.apache.beam.sdk.io.WriteFiles$WriteUnshardedTempFilesFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:185)
at org.apache.beam.runners.dataflow.****.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.****.util.common.****.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.****.util.common.****.OutputReceiver.process(OutputReceiver.java:54)
at org.apache.beam.runners.dataflow.****.AssignWindowsParDoFnFactory$AssignWindowsParDoFn.processElement(AssignWindowsParDoFnFactory.java:115)
at org.apache.beam.runners.dataflow.****.util.common.****.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.****.util.common.****.OutputReceiver.process(OutputReceiver.java:54)
at org.apache.beam.runners.dataflow.****.SimpleParDoFn$1.output(SimpleParDoFn.java:285)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:275)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$900(SimpleDoFnRunner.java:85)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:423)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:411)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:188)
at org.apache.beam.runners.dataflow.****.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.****.util.common.****.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.****.util.common.****.OutputReceiver.process(OutputReceiver.java:54)
at org.apache.beam.runners.dataflow.****.SimpleParDoFn$1.output(SimpleParDoFn.java:285)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:275)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$900(SimpleDoFnRunner.java:85)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:423)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:411)
at org.apache.beam.sdk.io.avro.AvroIOIT$DeterministicallyConstructAvroRecordsFn.processElement(AvroIOIT.java:216)
at org.apache.beam.sdk.io.avro.AvroIOIT$DeterministicallyConstructAvroRecordsFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:188)
at org.apache.beam.runners.dataflow.****.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.****.util.common.****.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.****.util.common.****.OutputReceiver.process(OutputReceiver.java:54)
at org.apache.beam.runners.dataflow.****.SimpleParDoFn$1.output(SimpleParDoFn.java:285)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:275)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.access$900(SimpleDoFnRunner.java:85)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:423)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:411)
at org.apache.beam.sdk.io.common.FileBasedIOITHelper$DeterministicallyConstructTestTextLineFn.processElement(FileBasedIOITHelper.java:49)
at org.apache.beam.sdk.io.common.FileBasedIOITHelper$DeterministicallyConstructTestTextLineFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.dataflow.****.repackaged.org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:188)
at org.apache.beam.runners.dataflow.****.SimpleParDoFn.processElement(SimpleParDoFn.java:340)
at org.apache.beam.runners.dataflow.****.util.common.****.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.****.util.common.****.OutputReceiver.process(OutputReceiver.java:54)
at org.apache.beam.runners.dataflow.****.util.common.****.ReadOperation.runReadLoop(ReadOperation.java:218)
at org.apache.beam.runners.dataflow.****.util.common.****.ReadOperation.start(ReadOperation.java:169)
at org.apache.beam.runners.dataflow.****.util.common.****.MapTaskExecutor.execute(MapTaskExecutor.java:83)
at org.apache.beam.runners.dataflow.****.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:319)
at org.apache.beam.runners.dataflow.****.BatchDataflowWorker.doWork(BatchDataflowWorker.java:291)
at org.apache.beam.runners.dataflow.****.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:221)
at org.apache.beam.runners.dataflow.****.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:147)
at org.apache.beam.runners.dataflow.****.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:127)
at org.apache.beam.runners.dataflow.****.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: [CIRCULAR REFERENCE:
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File
/.temp-beam-3ed1a6ac-3769-4833-9ad2-fc1e2f15c0c3/dc03dee798d540ae-5980-40e6-bb34-7f09684ec807
could only be written to 0 of the 1 minReplication nodes. There are 0
datanode(s) running and 0 node(s) are excluded in this operation.
at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:2315)
at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:294)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2960)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:904)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:593)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:604)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:572)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:556)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1093)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1043)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:971)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2976)
]
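The suppressed IllegalArgumentException above comes from a JDK rule rather than from Beam or Hadoop: Throwable.addSuppressed(t) rejects t == this, which is consistent with the exception thrown while closing the channel in FileBasedSink$Writer.closeChannelAndThrow being the same object as the one being rethrown. A minimal self-contained sketch of the JDK behavior:

    public class SelfSuppressionDemo {
        public static void main(String[] args) {
            Exception failure = new Exception("write failed");
            try {
                // The JDK forbids a throwable suppressing itself.
                failure.addSuppressed(failure);
            } catch (IllegalArgumentException e) {
                // Prints "Self-suppression not permitted", the message seen above.
                System.out.println(e.getMessage());
            }
        }
    }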
Aug 17, 2023 11:11:00 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-08-17T23:10:58.616Z: Finished operation Generate
sequence/Read(BoundedCountingSource)+Produce text lines+Produce Avro
records+Collect start time+Write Avro records to
files/Write/RewindowIntoGlobal/Window.Assign+Write Avro records to
files/Write/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+Write Avro
records to files/Write/GatherTempFileResults/Consolidate/Pair with random
key+Write Avro records to
files/Write/GatherTempFileResults/Consolidate/Reshuffle/Window.Into()/Window.Assign+Write
Avro records to
files/Write/GatherTempFileResults/Consolidate/Reshuffle/GroupByKey/Reify+Write
Avro records to
files/Write/GatherTempFileResults/Consolidate/Reshuffle/GroupByKey/Write+Write
Avro records to
files/Write/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+Write Avro
records to files/Write/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
Aug 17, 2023 11:11:00 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2023-08-17T23:10:58.693Z: Workflow failed. Causes: S03:Generate
sequence/Read(BoundedCountingSource)+Produce text lines+Produce Avro
records+Collect start time+Write Avro records to
files/Write/RewindowIntoGlobal/Window.Assign+Write Avro records to
files/Write/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+Write Avro
records to files/Write/GatherTempFileResults/Consolidate/Pair with random
key+Write Avro records to
files/Write/GatherTempFileResults/Consolidate/Reshuffle/Window.Into()/Window.Assign+Write
Avro records to
files/Write/GatherTempFileResults/Consolidate/Reshuffle/GroupByKey/Reify+Write
Avro records to
files/Write/GatherTempFileResults/Consolidate/Reshuffle/GroupByKey/Write+Write
Avro records to
files/Write/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+Write Avro
records to files/Write/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
failed., The job failed because a work item has failed 4 times. Look in
previous log entries for the cause of each one of the 4 failures. If the logs
only contain generic timeout errors related to accessing external resources,
such as MongoDB, verify that the **** service account has permission to access
the resource's subnetwork. For more information, see
https://cloud.google.com/dataflow/docs/guides/common-errors. The work item was
attempted on these ****s:
Root cause: Work item failed.
Worker ID: avroioit0writethenreadall-08171608-eizg-harness-j52g,
Root cause: Work item failed.
Worker ID: avroioit0writethenreadall-08171608-eizg-harness-47rd,
Root cause: Work item failed.
Worker ID: avroioit0writethenreadall-08171608-eizg-harness-t73g,
Root cause: Work item failed.
Worker ID: avroioit0writethenreadall-08171608-eizg-harness-nsph
Aug 17, 2023 11:11:00 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-08-17T23:10:58.772Z: Cleaning up.
Aug 17, 2023 11:11:00 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-08-17T23:10:58.876Z: Stopping **** pool...
Aug 17, 2023 11:13:28 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-08-17T23:13:28.209Z: Autoscaling: Resized **** pool from 5 to 0.
Aug 17, 2023 11:13:28 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-08-17T23:13:28.269Z: Worker pool stopped.
Aug 17, 2023 11:14:05 PM
org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2023-08-17_16_08_26-9007528499468311491 failed with status FAILED.
org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll STANDARD_OUT
Load test results for test (ID): debd63c1-a718-444a-86f0-1fedef8fc2ec and
timestamp: 2023-08-17T23:14:05.896000000Z:
Metric:        Value:
dataset_size   1.08973E9
write_time     0.0
run_time       0.0
read_time      0.0
org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll STANDARD_ERROR
ERROR StatusLogger Log4j2 could not find a logging implementation. Please
add log4j-core to the classpath. Using SimpleLogger to log to the console...
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:io:file-based-io-tests:integrationTest FAILED
org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll FAILED
java.lang.AssertionError: Values should be different. Actual: FAILED
at org.junit.Assert.fail(Assert.java:89)
at org.junit.Assert.failEquals(Assert.java:187)
at org.junit.Assert.assertNotEquals(Assert.java:163)
at org.junit.Assert.assertNotEquals(Assert.java:177)
at org.apache.beam.sdk.io.avro.AvroIOIT.writeThenReadAll(AvroIOIT.java:165)
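The AssertionError message follows JUnit's assertNotEquals format, so the check at AvroIOIT.java:165 presumably asserts that the pipeline's terminal state is not FAILED. A hedged reconstruction of that pattern (the exact source is not shown in this log; PipelineResult is Beam's public result handle):

    import static org.junit.Assert.assertNotEquals;

    import org.apache.beam.sdk.PipelineResult;

    public class TerminalStateAssertion {
        // Fails with "Values should be different. Actual: FAILED"
        // whenever the job's terminal state equals FAILED.
        static void assertNotFailed(PipelineResult result) {
            PipelineResult.State state = result.waitUntilFinish();
            assertNotEquals(PipelineResult.State.FAILED, state);
        }
    }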
1 test completed, 1 failed
Finished generating test XML results (0.035 secs) into:
<https://ci-beam.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.042 secs) into:
<https://ci-beam.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest>
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:file-based-io-tests:integrationTest'.
> There were failing tests. See the report at:
> <https://ci-beam.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.6.2/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 6m 53s
165 actionable tasks: 96 executed, 67 from cache, 2 up-to-date
Publishing build scan...
Publishing failed.
The response from https://ge.apache.org/scans/publish/gradle/3.13.2/token was
not from Gradle Enterprise.
The specified server address may be incorrect, or your network environment may
be interfering.
Please report this problem to your Gradle Enterprise administrator via
https://ge.apache.org/help and include the following via copy/paste:
----------
Gradle version: 7.6.2
Plugin version: 3.13.2
Request URL: https://ge.apache.org/scans/publish/gradle/3.13.2/token
Request ID: 9749266f-0481-4d7a-89f8-e0fc57520073
Response status code: 502
Response content type: text/html
----------
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure