See
<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/534/display/redirect?page=changes>
Changes:
[noreply] Update nbconvert requirement in /sdks/python
[bulatkazan] [Website] center the main content #24521
[Moritz Mack] [Spark Dataset runner] Add @Experimental and reduce visibility where
[Moritz Mack] [Spark Dataset runner] Broadcast pipeline options
[noreply] Fix grafana dashboard id (#24524)
[noreply] [Spark runner] Support running (VR) tests with Java 17 (closes #24400)
[noreply] Replaced deprecated finalize with DoFn Teardown (#24516)
[noreply] Bump cloud.google.com/go/storage from 1.28.0 to 1.28.1 in /sdks (#24517)
[noreply] add clarifier to error message (#24449)
[noreply] Batch rename requests in fileio.WriteToFiles (#24341)
[noreply] Bump golang.org/x/text from 0.4.0 to 0.5.0 in /sdks (#24520)
[noreply] Support for JsonSchema in Kafka Read Schema Transform (#24272)
[noreply] Run go fmt over full go directory with go 1.19 (#24525)
[noreply] Cloudbuild+manualsetup+playground (#24144)
[noreply] Bump golang.org/x/sys from 0.2.0 to 0.3.0 in /sdks (#24519)
[noreply] Bump cloud.google.com/go/bigtable from 1.18.0 to 1.18.1 in /sdks
[noreply] Update from interface{} -> any for core packages (#24505)
[noreply] Implement FileWriteSchemaTransformConfiguration (#24479)
[noreply] Bump cloud.google.com/go/pubsub from 1.27.0 to 1.27.1 in /sdks (#24518)
[noreply] [Playground] Healthcheck was added (#24227)
[noreply] Update dataflow container version for Pandas upgrade (#24532)
[bulatkazan] [Website] update copy icon positioning #24426
------------------------------------------
[...truncated 51.89 KB...]
> Task :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.
As an alternative, use `gcloud auth configure-docker` to configure `docker` to use `gcloud` as a credential helper, then use `docker` as you would for non-GCR registries, e.g. `docker pull gcr.io/project-id/my-image`. Add `--verbosity=error` to silence this warning: `gcloud docker --verbosity=error -- pull gcr.io/project-id/my-image`.
See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
d1a1f6b1c730: Preparing
700b281d5753: Preparing
4eb349d962ec: Preparing
7408a82a7e42: Preparing
02ad61795b42: Preparing
ac6c0285b584: Preparing
b05787c02131: Preparing
bb9c5b72887d: Preparing
5d17c1b4e29c: Preparing
18bf5dee9b83: Preparing
12482291af6b: Preparing
b565464e6543: Preparing
39ff44e1cdd8: Preparing
b297316f1209: Preparing
5bb01bf0b3c8: Preparing
7b7f3078e1db: Preparing
826c3ddbb29c: Preparing
b626401ef603: Preparing
9b55156abf26: Preparing
293d5db30c9f: Preparing
03127cdb479b: Preparing
9c742cd6c7a5: Preparing
bb9c5b72887d: Waiting
ac6c0285b584: Waiting
5d17c1b4e29c: Waiting
18bf5dee9b83: Waiting
826c3ddbb29c: Waiting
b626401ef603: Waiting
12482291af6b: Waiting
b565464e6543: Waiting
9c742cd6c7a5: Waiting
39ff44e1cdd8: Waiting
9b55156abf26: Waiting
b297316f1209: Waiting
7b7f3078e1db: Waiting
293d5db30c9f: Waiting
03127cdb479b: Waiting
b05787c02131: Waiting
5bb01bf0b3c8: Waiting
7408a82a7e42: Pushed
4eb349d962ec: Pushed
700b281d5753: Pushed
02ad61795b42: Pushed
d1a1f6b1c730: Pushed
bb9c5b72887d: Pushed
b05787c02131: Pushed
18bf5dee9b83: Pushed
5d17c1b4e29c: Pushed
ac6c0285b584: Pushed
b565464e6543: Pushed
7b7f3078e1db: Layer already exists
826c3ddbb29c: Layer already exists
12482291af6b: Pushed
b626401ef603: Layer already exists
9b55156abf26: Layer already exists
293d5db30c9f: Layer already exists
03127cdb479b: Layer already exists
9c742cd6c7a5: Layer already exists
39ff44e1cdd8: Pushed
b297316f1209: Pushed
5bb01bf0b3c8: Pushed
20221206123729: digest: sha256:2a215dec48b337d296a7db6c6155aadae4d52e76241cb14e58b0245503d798a6 size: 4934
> Task :sdks:java:testing:load-tests:run
Dec 06, 2022 12:38:19 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 06, 2022 12:38:20 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from
the classpath: will stage 232 files. Enable logging at DEBUG level to see which
files will be staged.
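For reference, both the stagingLocation fallback noted above and the classpath-derived filesToStage default can be set explicitly on the pipeline options. A minimal Java sketch; the bucket paths are illustrative, not this test's actual configuration:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Illustrative only: configure staging explicitly instead of relying on the defaults
    // reported in the log (gcpTempLocation fallback, classpath-derived filesToStage).
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
    options.setGcpTempLocation("gs://example-bucket/temp");      // hypothetical bucket
    options.setStagingLocation("gs://example-bucket/staging");   // used instead of the gcpTempLocation fallback
    // options.setFilesToStage(...) can likewise pin the exact jars to upload
    // rather than staging everything found on the classpath.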
Dec 06, 2022 12:38:20 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names:
ParDo(TimeMonitor)
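This warning usually clears when each transform application is given an explicit, unique name via the two-argument apply. A minimal sketch with a hypothetical MonitorFn standing in for the test's TimeMonitor (all names here are illustrative):

    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;

    // Hypothetical DoFn standing in for TimeMonitor; it simply passes elements through.
    class MonitorFn extends DoFn<String, String> {
      @ProcessElement
      public void process(@Element String element, OutputReceiver<String> out) {
        out.output(element);
      }
    }

    // Naming each application keeps step names stable and unique across runs,
    // which Dataflow also requires for pipeline update compatibility checks.
    PCollection<String> timed =
        input.apply("MonitorIngestTime", ParDo.of(new MonitorFn()));
    PCollection<String> timedAgain =
        timed.apply("MonitorEgressTime", ParDo.of(new MonitorFn()));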
Dec 06, 2022 12:38:20 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
Dec 06, 2022 12:38:23 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 232 files from PipelineOptions.filesToStage to staging location
to prepare for execution.
Dec 06, 2022 12:38:23 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 232 files cached, 0 files newly uploaded in 0
seconds
Dec 06, 2022 12:38:23 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 06, 2022 12:38:23 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <118764 bytes, hash
ed39019e85054221931b477f8d766c4d6b17e3045e04fede37f2c9979958651d> to
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-7TkBnoUFQiGTG0d_jXZsTWsX4wReBP7eN_LJl5lYZR0.pb
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as
step s1
Dec 06, 2022 12:38:25 PM
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0,
endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000,
endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000,
endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000,
endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000,
endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000,
endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000,
endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000,
endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000,
endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000,
endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000,
endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000,
endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000,
endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000,
endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000,
endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000,
endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000,
endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000,
endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000,
endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000,
endOffset=20000000}]
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Dec 06, 2022 12:38:25 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
Dec 06, 2022 12:38:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.45.0-SNAPSHOT
Dec 06, 2022 12:38:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-12-06_04_38_25-18078268946406512901?project=apache-beam-testing
Dec 06, 2022 12:38:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-12-06_04_38_25-18078268946406512901
Dec 06, 2022 12:38:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-12-06_04_38_25-18078268946406512901
Dec 06, 2022 12:38:32 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-12-06T12:38:31.363Z: The workflow name is not a valid Cloud
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring
will be labeled with this modified job name:
load0tests0java110dataflow0v20streaming0pardo01-jenkins-12-rbxp. For the best
monitoring experience, please name your job with a valid Cloud Label. For
details, see:
https://cloud.google.com/compute/docs/labeling-resources#restrictions
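Dataflow derives the modified label above automatically; supplying a job name that is already a valid Cloud Label (lowercase letters, digits, and hyphens) avoids the rewrite. A minimal sketch; the name shown is illustrative, not the load test's actual job name:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
    // Lowercase letters, digits, and hyphens only, so the job name doubles as a valid Cloud Label.
    options.setJobName("load-tests-java11-dataflow-v2-streaming-pardo-1"); // illustrative name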
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:39.372Z: Worker configuration: e2-standard-2 in
us-central1-b.
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:40.579Z: Expanding SplittableParDo operations into
optimizable parts.
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:40.609Z: Expanding CollectionToSingleton operations into
optimizable parts.
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:40.682Z: Expanding CoGroupByKey operations into
optimizable parts.
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:40.712Z: Expanding SplittableProcessKeyed operations
into optimizable parts.
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:40.743Z: Expanding GroupByKey operations into streaming
Read/Write steps
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:40.804Z: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:40.875Z: Fusing adjacent ParDo, Read, Write, and Flatten
operations
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:40.895Z: Fusing consumer Read
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read
input/Impulse
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:40.930Z: Fusing consumer
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:40.954Z: Fusing consumer
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
into
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:40.988Z: Fusing consumer Read
input/ParDo(StripIds)/ParMultiDo(StripIds) into
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.013Z: Fusing consumer
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into Read
input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.047Z: Fusing consumer
ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Dec 06, 2022 12:38:41 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.079Z: Fusing consumer Step:
0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Dec 06, 2022 12:38:42 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.104Z: Fusing consumer Step:
1/ParMultiDo(CounterOperation) into Step: 0/ParMultiDo(CounterOperation)
Dec 06, 2022 12:38:42 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.138Z: Fusing consumer Step:
2/ParMultiDo(CounterOperation) into Step: 1/ParMultiDo(CounterOperation)
Dec 06, 2022 12:38:42 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.175Z: Fusing consumer Step:
3/ParMultiDo(CounterOperation) into Step: 2/ParMultiDo(CounterOperation)
Dec 06, 2022 12:38:42 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.200Z: Fusing consumer Step:
4/ParMultiDo(CounterOperation) into Step: 3/ParMultiDo(CounterOperation)
Dec 06, 2022 12:38:42 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.234Z: Fusing consumer Step:
5/ParMultiDo(CounterOperation) into Step: 4/ParMultiDo(CounterOperation)
Dec 06, 2022 12:38:42 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.258Z: Fusing consumer Step:
6/ParMultiDo(CounterOperation) into Step: 5/ParMultiDo(CounterOperation)
Dec 06, 2022 12:38:42 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.294Z: Fusing consumer Step:
7/ParMultiDo(CounterOperation) into Step: 6/ParMultiDo(CounterOperation)
Dec 06, 2022 12:38:42 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.328Z: Fusing consumer Step:
8/ParMultiDo(CounterOperation) into Step: 7/ParMultiDo(CounterOperation)
Dec 06, 2022 12:38:42 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.362Z: Fusing consumer Step:
9/ParMultiDo(CounterOperation) into Step: 8/ParMultiDo(CounterOperation)
Dec 06, 2022 12:38:42 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.396Z: Fusing consumer
ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step:
9/ParMultiDo(CounterOperation)
Dec 06, 2022 12:38:42 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:41.496Z: Running job using Streaming Engine
Dec 06, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:42.743Z: Starting 5 workers in us-central1-b...
Dec 06, 2022 12:38:53 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:38:52.561Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 06, 2022 12:39:18 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:39:17.587Z: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 06, 2022 12:39:18 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:39:17.618Z: Autoscaling: Resized worker pool to 4, though goal was 5. This could be a quota issue.
Dec 06, 2022 12:39:27 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:39:27.094Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 06, 2022 12:40:29 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:40:28.564Z: Workers have started successfully.
Dec 06, 2022 12:40:29 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:40:28.926Z: All workers have finished the startup processes and began to receive work requests.
> Task :sdks:java:testing:load-tests:run FAILED
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=e98afb66-1631-46fd-9b6a-5741a09c1cc7, currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 3119615
log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-3119615.out.log
----- Last 20 lines from daemon log file - daemon-3119615.out.log -----
Dec 06, 2022 12:40:29 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-06T12:40:28.926Z: All workers have finished the startup processes and began to receive work requests.
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221206123729
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2a215dec48b337d296a7db6c6155aadae4d52e76241cb14e58b0245503d798a6
Daemon vm is shutting down... The daemon has exited normally or was terminated
in response to a user interrupt.
Remove shutdown hook failed
java.lang.IllegalStateException: Shutdown in progress
        at java.lang.ApplicationShutdownHooks.remove(ApplicationShutdownHooks.java:82)
        at java.lang.Runtime.removeShutdownHook(Runtime.java:231)
        at org.gradle.process.internal.shutdown.ShutdownHooks.removeShutdownHook(ShutdownHooks.java:38)
        at org.gradle.process.internal.DefaultExecHandle.setEndStateInfo(DefaultExecHandle.java:208)
        at org.gradle.process.internal.DefaultExecHandle.aborted(DefaultExecHandle.java:366)
        at org.gradle.process.internal.ExecHandleRunner.completed(ExecHandleRunner.java:108)
        at org.gradle.process.internal.ExecHandleRunner.run(ExecHandleRunner.java:84)
        at org.gradle.internal.operations.CurrentBuildOperationPreservingRunnable.run(CurrentBuildOperationPreservingRunnable.java:42)
        at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
        at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
----- End of the daemon log -----
FAILURE: Build failed with an exception.
* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221206123729
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:2a215dec48b337d296a7db6c6155aadae4d52e76241cb14e58b0245503d798a6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]