See <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/353/display/redirect>
Changes:
------------------------------------------
[...truncated 606.24 KB...]
#23 exporting to image
#23 exporting layers
#23 exporting layers 0.3s done
#23 writing image sha256:d98b30380aa0a5df2c27bafcb14219f40c9ea6afa986a382f68dd892ab7fab91 done
#23 naming to docker.io/apache/beam_java8_sdk:2.48.0.dev done
#23 DONE 0.3s

:sdks:java:container:java8:docker (Thread[Execution ****,5,main]) completed. Took 5.638 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer (Thread[Execution **** Thread 7,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer (Thread[Execution **** Thread 7,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer (Thread[Execution **** Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer
Custom actions are attached to task ':runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer'.
Caching disabled for task ':runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer' because: Gradle would require more information to cache this task
Task ':runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer' is not up-to-date because: Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker tag apache/beam_java8_sdk:2.48.0.dev us.gcr.io/apache-beam-testing/java-postcommit-it/java:20230406125655
Successfully started process 'command 'docker''
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud docker -- push us.gcr.io/apache-beam-testing/java-postcommit-it/java:20230406125655
Successfully started process 'command 'gcloud''
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.
As an alternative, use `gcloud auth configure-docker` to configure `docker` to use `gcloud` as a credential helper, then use `docker` as you would for non-GCR registries, e.g. `docker pull gcr.io/project-id/my-image`.
Add `--verbosity=error` to silence this warning: `gcloud docker --verbosity=error -- pull gcr.io/project-id/my-image`.
See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
4c410e96f818: Preparing
5f70bf18a086: Preparing
2dd4b9830ae5: Preparing
2f0c75fdeba0: Preparing
7474bba80327: Preparing
21c6bf8321af: Preparing
7b9101beb1bf: Preparing
3432404bda30: Preparing
c935d51bc5c7: Preparing
24692018994d: Preparing
f85100755ce1: Preparing
3adbe58e02c0: Preparing
837de7dc8408: Preparing
22c629eb3977: Preparing
eb6f59c27140: Preparing
96399d03c4a0: Preparing
aaa7f3126e65: Preparing
533af4d05d89: Preparing
d09fdd9ad3ef: Preparing
04d1dcab20cb: Preparing
b93c1bd012ab: Preparing
3432404bda30: Waiting
c935d51bc5c7: Waiting
f85100755ce1: Waiting
24692018994d: Waiting
3adbe58e02c0: Waiting
aaa7f3126e65: Waiting
837de7dc8408: Waiting
533af4d05d89: Waiting
22c629eb3977: Waiting
d09fdd9ad3ef: Waiting
eb6f59c27140: Waiting
04d1dcab20cb: Waiting
7b9101beb1bf: Waiting
21c6bf8321af: Waiting
96399d03c4a0: Waiting
b93c1bd012ab: Waiting
5f70bf18a086: Layer already exists
7474bba80327: Pushed
2f0c75fdeba0: Pushed
2dd4b9830ae5: Pushed
21c6bf8321af: Pushed
4c410e96f818: Pushed
3432404bda30: Pushed
c935d51bc5c7: Pushed
f85100755ce1: Pushed
7b9101beb1bf: Pushed
24692018994d: Pushed
837de7dc8408: Pushed
3adbe58e02c0: Pushed
533af4d05d89: Layer already exists
d09fdd9ad3ef: Layer already exists
04d1dcab20cb: Layer already exists
b93c1bd012ab: Layer already exists
22c629eb3977: Pushed
eb6f59c27140: Pushed
aaa7f3126e65: Pushed
96399d03c4a0: Pushed
20230406125655: digest: sha256:ddf81814eb902530955b173ed5f6ccfec49a24ef5297e94989474232746823bc size: 4710
:runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer (Thread[Execution **** Thread 6,5,main]) completed. Took 6.899 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution ****,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution ****,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution **** Thread 7,5,main]) started.
producer locations for task group 0 (Thread[Execution ****,5,main]) started.
producer locations for task group 0 (Thread[Execution ****,5,main]) completed. Took 0.0 secs.
Gradle Test Executor 2 started executing tests.

> Task :sdks:java:io:sparkreceiver:2:integrationTest
Custom actions are attached to task ':sdks:java:io:sparkreceiver:2:integrationTest'.
Build cache key for task ':sdks:java:io:sparkreceiver:2:integrationTest' is 4d756da66374261468d9f1256b8afbca
Task ':sdks:java:io:sparkreceiver:2:integrationTest' is not up-to-date because: Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"5000000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=sparkreceiverioit_results","--influxMeasurement=sparkreceiverioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--rabbitMqBootstrapServerAddress=amqp://guest:[email protected]:5672","--streamName=rabbitMqTestStream","--readTimeout=1800","--numWorkers=1","--autoscalingAlgorithm=NONE","--experiments=use_runner_v2","--sdkContainerImage=us.gcr.io/apache-beam-testing/java-postcommit-it/java:20230406125655","--region=us-central1"] -Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager -Dorg.gradle.internal.****.tmpdir=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/tmp/integrationTest/work> -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/7.5.1/****Main/gradle-****.jar ****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT STANDARD_ERROR
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-simple/1.7.30/e606eac955f55ecf1d8edcccba04eb8ac98088dd/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.48.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > testSparkReceiverIOReadsInStreamingWithOffset STANDARD_ERROR
[Test ****] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT - 5000000 records were successfully written to RabbitMQ
[Test ****] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO - ReadFromSparkReceiverWithOffsetDoFn started reading
[Test ****] INFO org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory - No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
[Test ****] WARN org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer - Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
[Test ****] INFO org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory - No stagingLocation provided, falling back to gcpTempLocation
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 428 files. Enable logging at DEBUG level to see which files will be staged.
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
[Test ****] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading 428 files from PipelineOptions.filesToStage to staging location to prepare for execution.
[pool-8-thread-18] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading /tmp/test5547566363273348201.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/test-ByIegv9wmULwv3lGYpIedZLodkI9EiGpi3OJ-ITkbnw.jar
[pool-8-thread-32] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading /tmp/main7986834544455131246.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/main-TLG40ke3CXM4AldSMDwu0yyNTrX4Mp9ORjJ5IJWlJg8.jar
[pool-8-thread-9] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading /tmp/test6542215117668548436.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/test-13A3kR0wQ8ZASUIMIHv_r6FuZPPYKHtEjuWUr84BZfg.jar
[Test ****] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Staging files complete: 425 files cached, 3 files newly uploaded in 2 seconds
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - Staging portable pipeline proto to gs://dataflow-staging-us-central1-844138762903/temp/staging/
[pool-15-thread-1] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading <161055 bytes, hash 19c963446e24c2d7e266007d678811662b1ebf94f3e0d4823ab678a32ee5e191> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-GcljRG4kwtfiZgB9Z4gRZisev5Tz4NSCOrZ4oy7l4ZE.pb
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse as step s1
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair with initial restriction as step s2
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split restriction as step s3
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode windows as step s4
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign unique key/AddKeys/Map as step s5
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/ProcessKeyedElements as step s6
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Measure read time as step s7
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Counting element as step s8
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - Dataflow SDK version: 2.48.0-SNAPSHOT
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-06_06_02_44-14018253160088645983?project=apache-beam-testing
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - Submitted job: 2023-04-06_06_02_44-14018253160088645983
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2023-04-06_06_02_44-14018253160088645983
[Test ****] WARN org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:02:51.216Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: sparkreceiverioit0testsparkreceiverioreadsinstreamingwitho-us7d. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:09.666Z: Worker configuration: e2-standard-2 in us-central1-f.
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:11.006Z: Expanding SplittableParDo operations into optimizable parts.
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:11.039Z: Expanding CollectionToSingleton operations into optimizable parts.
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:11.112Z: Expanding CoGroupByKey operations into optimizable parts.
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:11.169Z: Expanding SplittableProcessKeyed operations into optimizable parts.
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:11.197Z: Expanding GroupByKey operations into streaming Read/Write steps
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:11.229Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:11.316Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:11.344Z: Fusing consumer Read-from-unbounded-RabbitMq-SparkReceiverIO-ReadFromSparkReceiverViaSdf-ParDo-ReadFromSparkReceiver/PairWithRestriction into Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:11.376Z: Fusing consumer Read-from-unbounded-RabbitMq-SparkReceiverIO-ReadFromSparkReceiverViaSdf-ParDo-ReadFromSparkReceiver/SplitWithSizing into Read-from-unbounded-RabbitMq-SparkReceiverIO-ReadFromSparkReceiverViaSdf-ParDo-ReadFromSparkReceiver/PairWithRestriction
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:11.410Z: Fusing consumer Measure read time/ParMultiDo(TimeMonitor) into Read-from-unbounded-RabbitMq-SparkReceiverIO-ReadFromSparkReceiverViaSdf-ParDo-ReadFromSparkReceiver/ProcessElementAndRestrictionWithSizing
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:11.442Z: Fusing consumer Counting element/ParMultiDo(Counting) into Measure read time/ParMultiDo(TimeMonitor)
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:11.577Z: Running job using Streaming Engine
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:12.930Z: Starting 1 ****s in us-central1-f...
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:25.082Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:03:57.912Z: Autoscaling: Raised the number of ****s to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:04:58.201Z: Workers have started successfully.
[Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-06T13:04:58.920Z: All ****s have finished the startup processes and began to receive work requests.
[Test ****] WARN org.apache.beam.runners.dataflow.DataflowPipelineJob - No terminal state was returned within allotted timeout. State value RUNNING
[Test ****] ERROR org.apache.beam.sdk.testutils.metrics.MetricsReader - Failed to get metric spark_read_element_count, from namespace org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > testSparkReceiverIOReadsInStreamingWithOffset FAILED
    java.lang.AssertionError: expected:<5000000> but was:<-1>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.testSparkReceiverIOReadsInStreamingWithOffset(SparkReceiverIOIT.java:337)

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:sparkreceiver:2:integrationTest
1 test completed, 1 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest>

> Task :sdks:java:io:sparkreceiver:2:integrationTest FAILED
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution **** Thread 7,5,main]) completed. Took 35 mins 46.211 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution **** Thread 4,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution **** Thread 3,5,main]) started.
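[Editor's note on the failure above: the `-1` in `expected:<5000000> but was:<-1>` is consistent with a missing-metric sentinel. The preceding ERROR line shows MetricsReader could not find the `spark_read_element_count` counter, so the count assertion compares the expected record total against a "not found" value. A minimal, self-contained sketch of that sentinel pattern, assuming a map-backed lookup for illustration (the real MetricsReader queries Dataflow job metrics, not a map):]

```java
import java.util.HashMap;
import java.util.Map;

public class MetricSentinelSketch {

    // Illustrative stand-in for a metrics lookup: returns -1 when the
    // named counter was never reported, mirroring the value seen in the log.
    static long getCounterMetric(Map<String, Long> metrics, String name) {
        if (!metrics.containsKey(name)) {
            System.err.println("Failed to get metric " + name);
            return -1L; // sentinel: the metric was never published
        }
        return metrics.get(name);
    }

    public static void main(String[] args) {
        Map<String, Long> published = new HashMap<>();
        // In the failing run, no worker incremented spark_read_element_count,
        // so the lookup falls through to the sentinel:
        long count = getCounterMetric(published, "spark_read_element_count");
        System.out.println(count); // prints -1, which assertEquals(5000000, count) then reports
    }
}
```

[This suggests the pipeline never read (or never counted) the 5,000,000 RabbitMQ records within the 1800-second `readTimeout`, rather than reading a wrong number of them.]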
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' because: Gradle would require more information to cache this task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' is not up-to-date because: Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi --force us.gcr.io/apache-beam-testing/java-postcommit-it/java:20230406125655
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20230406125655
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ddf81814eb902530955b173ed5f6ccfec49a24ef5297e94989474232746823bc
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images untag us.gcr.io/apache-beam-testing/java-postcommit-it/java:20230406125655
Successfully started process 'command 'gcloud''
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20230406125655]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ddf81814eb902530955b173ed5f6ccfec49a24ef5297e94989474232746823bc]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20230406125655] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ddf81814eb902530955b173ed5f6ccfec49a24ef5297e94989474232746823bc])].
Starting process 'command './scripts/cleanup_untagged_gcr_images.sh''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java> Command: ./scripts/cleanup_untagged_gcr_images.sh us.gcr.io/apache-beam-testing/java-postcommit-it/java
Successfully started process 'command './scripts/cleanup_untagged_gcr_images.sh''
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ddf81814eb902530955b173ed5f6ccfec49a24ef5297e94989474232746823bc
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ddf81814eb902530955b173ed5f6ccfec49a24ef5297e94989474232746823bc
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ddf81814eb902530955b173ed5f6ccfec49a24ef5297e94989474232746823bc].
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution **** Thread 3,5,main]) completed. Took 4.725 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:cleanUp (Thread[Execution **** Thread 4,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:cleanUp (Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:cleanUp (Thread[Execution **** Thread 3,5,main]) started.

> Task :sdks:java:io:sparkreceiver:2:cleanUp
Skipping task ':sdks:java:io:sparkreceiver:2:cleanUp' as it has no actions.
:sdks:java:io:sparkreceiver:2:cleanUp (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:sparkreceiver:2:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 36m 53s
166 actionable tasks: 106 executed, 56 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/6cr3fxmjlz36w

Stopped 1 **** daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
