See
<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/535/display/redirect?page=changes>
Changes:
[noreply] Update google-cloud-bigquery-storage requirement from <2.14,>=2.6.3 to
[noreply] Bump cloud.google.com/go/pubsub from 1.27.1 to 1.28.0 in /sdks (#24534)
[noreply] Bump golang.org/x/net from 0.2.0 to 0.3.0 in /sdks (#24544)
[noreply] Precommit python version update (#24526)
[noreply] [CdapIO] Add CdapIO and SparkReceiverIO documentation in website
[relax] fix null pointer exception caused by clearing member variable
[noreply] Adding support for Pubsub Lite Writes in SchemaTransforms (#24359)
[noreply] Disallow using the JRH with Python streaming pipelines (#24513)
[noreply] Add RunInference example for TensorFlow Hub pre-trained model (#24529)
[noreply] update(PULL Request template) remove Choose reviewer (#24540)
[noreply] Revert "Bump actions/setup-java from 3.6.0 to 3.7.0 (#24484)" (#24551)
[noreply] Interface{}->any for more subfolders (#24553)
[Kenneth Knowles] Support multiple gradle tasks in one precommit job
[Kenneth Knowles] Split up some IOs from Java PreCommit
------------------------------------------
[...truncated 49.30 KB...]
> Task :sdks:java:container:java11:copySdkHarnessLauncher
Execution optimizations have been disabled for task
':sdks:java:container:java11:copySdkHarnessLauncher' to ensure correctness due
to the following reasons:
- Gradle detected a problem with the following location:
'<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'.
Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this
output of task ':sdks:java:container:downloadCloudProfilerAgent' without
declaring an explicit or implicit dependency. This can lead to incorrect
results being produced, depending on what order the tasks are executed. Please
refer to
https://docs.gradle.org/7.5.1/userguide/validation_problems.html#implicit_dependency
for more details about this problem.
- Gradle detected a problem with the following location:
'<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'.
Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this
output of task ':sdks:java:container:pullLicenses' without declaring an
explicit or implicit dependency. This can lead to incorrect results being
produced, depending on what order the tasks are executed. Please refer to
https://docs.gradle.org/7.5.1/userguide/validation_problems.html#implicit_dependency
for more details about this problem.
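The two warnings above are Gradle's implicit-dependency validation: copySdkHarnessLauncher consumes outputs of downloadCloudProfilerAgent and pullLicenses without an ordering relationship between the tasks, so execution order is undefined. A minimal sketch of the kind of fix the linked Gradle docs describe, assuming the task names from the log (Beam's actual build wiring may declare inputs differently, e.g. via task outputs rather than dependsOn):

```groovy
// Hypothetical build.gradle fragment: declare the producing tasks as explicit
// dependencies so Gradle schedules them before copySdkHarnessLauncher and can
// re-enable execution optimizations for it. Task names taken from the log.
tasks.named('copySdkHarnessLauncher').configure {
    dependsOn ':sdks:java:container:downloadCloudProfilerAgent'
    dependsOn ':sdks:java:container:pullLicenses'
}
```

Wiring the dependency via the producing task's outputs (`from(tasks.named('pullLicenses'))` in a Copy task) is generally preferred to a bare dependsOn, since it also declares the file inputs.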
> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java11:copyGolangLicenses
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker
> Task :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above
18.03.
As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.
See:
https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
The push refers to repository
[us.gcr.io/apache-beam-testing/java-postcommit-it/java]
95a9a98164d8: Preparing
eed01ff983ee: Preparing
e26e0eb3833b: Preparing
f4f5f33422ae: Preparing
0be9f8068ec7: Preparing
a0f0d80feddf: Preparing
6df0fe24985c: Preparing
e8067e356a9e: Preparing
d3bca2a91858: Preparing
4eba9eef2309: Preparing
3c209a82ca9f: Preparing
4ad45f74dfb8: Preparing
91e3f1412ad7: Preparing
b24a0d501b6f: Preparing
ee71fe715760: Preparing
7b7f3078e1db: Preparing
826c3ddbb29c: Preparing
b626401ef603: Preparing
9b55156abf26: Preparing
293d5db30c9f: Preparing
6df0fe24985c: Waiting
03127cdb479b: Preparing
9c742cd6c7a5: Preparing
e8067e356a9e: Waiting
3c209a82ca9f: Waiting
d3bca2a91858: Waiting
4ad45f74dfb8: Waiting
4eba9eef2309: Waiting
91e3f1412ad7: Waiting
b24a0d501b6f: Waiting
9b55156abf26: Waiting
b626401ef603: Waiting
9c742cd6c7a5: Waiting
826c3ddbb29c: Waiting
293d5db30c9f: Waiting
a0f0d80feddf: Waiting
03127cdb479b: Waiting
7b7f3078e1db: Waiting
ee71fe715760: Waiting
eed01ff983ee: Pushed
f4f5f33422ae: Pushed
e26e0eb3833b: Pushed
0be9f8068ec7: Pushed
95a9a98164d8: Pushed
6df0fe24985c: Pushed
e8067e356a9e: Pushed
4eba9eef2309: Pushed
a0f0d80feddf: Pushed
3c209a82ca9f: Pushed
4ad45f74dfb8: Pushed
d3bca2a91858: Pushed
7b7f3078e1db: Layer already exists
826c3ddbb29c: Layer already exists
9b55156abf26: Layer already exists
b626401ef603: Layer already exists
293d5db30c9f: Layer already exists
03127cdb479b: Layer already exists
9c742cd6c7a5: Layer already exists
91e3f1412ad7: Pushed
b24a0d501b6f: Pushed
ee71fe715760: Pushed
20221207123730: digest:
sha256:392be829acd5b5cb0ca225823599ae71ac181228b5715b2fb7ec45686f9f3e92 size:
4934
> Task :sdks:java:testing:load-tests:run
Dec 07, 2022 12:38:24 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 07, 2022 12:38:25 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from
the classpath: will stage 232 files. Enable logging at DEBUG level to see which
files will be staged.
Dec 07, 2022 12:38:26 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names:
ParDo(TimeMonitor)
Dec 07, 2022 12:38:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
Dec 07, 2022 12:38:28 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 232 files from PipelineOptions.filesToStage to staging location
to prepare for execution.
Dec 07, 2022 12:38:28 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 232 files cached, 0 files newly uploaded in 0
seconds
Dec 07, 2022 12:38:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 07, 2022 12:38:28 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <118764 bytes, hash
dcb55ff0c31516d4f9dc54105f7cae1877eeca03eed14f1fb8dc84928641d9b4> to
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-3LVf8MMVFtT53FQQX3yuGHfuygPu0U8fuNyEkoZB2bQ.pb
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as
step s1
Dec 07, 2022 12:38:30 PM
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0,
endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000,
endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000,
endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000,
endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000,
endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000,
endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000,
endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000,
endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000,
endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000,
endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000,
endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000,
endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000,
endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000,
endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000,
endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000,
endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000,
endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000,
endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000,
endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000,
endOffset=20000000}]
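The split above divides the offset range [0, 20000000) into 20 contiguous bundles of 1,000,000 offsets each. A minimal sketch of that arithmetic, not Beam's actual SyntheticUnboundedSource.split implementation:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: split [start, end) into `bundles` contiguous
// half-open ranges, mirroring the bundle boundaries shown in the log.
public class BundleSplit {
    static List<long[]> split(long start, long end, int bundles) {
        List<long[]> out = new ArrayList<>();
        long size = (end - start) / bundles; // even division assumed here
        for (int i = 0; i < bundles; i++) {
            long s = start + i * size;
            // Last bundle absorbs any remainder from uneven division.
            long e = (i == bundles - 1) ? end : s + size;
            out.add(new long[] {s, e});
        }
        return out;
    }

    public static void main(String[] args) {
        List<long[]> b = split(0L, 20_000_000L, 20);
        System.out.println(b.size());     // 20
        System.out.println(b.get(0)[1]);  // 1000000
        System.out.println(b.get(19)[0]); // 19000000
    }
}
```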
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Dec 07, 2022 12:38:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
Dec 07, 2022 12:38:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.45.0-SNAPSHOT
Dec 07, 2022 12:38:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-12-07_04_38_30-2984453022487971527?project=apache-beam-testing
Dec 07, 2022 12:38:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-12-07_04_38_30-2984453022487971527
Dec 07, 2022 12:38:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
> --region=us-central1 2022-12-07_04_38_30-2984453022487971527
Dec 07, 2022 12:38:34 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-12-07T12:38:34.852Z: The workflow name is not a valid Cloud
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring
will be labeled with this modified job name:
load0tests0java110dataflow0v20streaming0pardo01-jenkins-12-b75f. For the best
monitoring experience, please name your job with a valid Cloud Label. For
details, see:
https://cloud.google.com/compute/docs/labeling-resources#restrictions
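The modified name in the warning (underscores and uppercase replaced) reflects GCE's label restrictions: lowercase letters, digits, and hyphens only. A minimal sketch of that sanitization, assuming the "0" replacement visible in the log's modified job name; Dataflow's real logic may differ:

```java
// Hypothetical sketch: lowercase the job name and replace characters that
// are not valid in a Cloud Label with "0", matching the modified name the
// warning reports. Not Dataflow's actual implementation.
public class LabelSanitizer {
    static String toCloudLabel(String jobName) {
        // Hyphen at the end of the character class is literal.
        return jobName.toLowerCase().replaceAll("[^a-z0-9-]", "0");
    }

    public static void main(String[] args) {
        // Illustrative input; the real job name is not shown in the log.
        System.out.println(toCloudLabel("load_tests_Java11_Dataflow_V2_streaming_ParDo_1"));
        // load0tests0java110dataflow0v20streaming0pardo01
    }
}
```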
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:42.543Z: Worker configuration: e2-standard-2 in
us-central1-b.
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:43.692Z: Expanding SplittableParDo operations into
optimizable parts.
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:43.732Z: Expanding CollectionToSingleton operations into
optimizable parts.
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:43.809Z: Expanding CoGroupByKey operations into
optimizable parts.
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:43.847Z: Expanding SplittableProcessKeyed operations
into optimizable parts.
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:43.876Z: Expanding GroupByKey operations into streaming
Read/Write steps
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:43.912Z: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:43.978Z: Fusing adjacent ParDo, Read, Write, and Flatten
operations
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.017Z: Fusing consumer Read
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read
input/Impulse
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.040Z: Fusing consumer
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.068Z: Fusing consumer
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
into
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.096Z: Fusing consumer Read
input/ParDo(StripIds)/ParMultiDo(StripIds) into
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.118Z: Fusing consumer
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into Read
input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.144Z: Fusing consumer
ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.184Z: Fusing consumer Step:
0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.208Z: Fusing consumer Step:
1/ParMultiDo(CounterOperation) into Step: 0/ParMultiDo(CounterOperation)
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.242Z: Fusing consumer Step:
2/ParMultiDo(CounterOperation) into Step: 1/ParMultiDo(CounterOperation)
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.271Z: Fusing consumer Step:
3/ParMultiDo(CounterOperation) into Step: 2/ParMultiDo(CounterOperation)
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.311Z: Fusing consumer Step:
4/ParMultiDo(CounterOperation) into Step: 3/ParMultiDo(CounterOperation)
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.334Z: Fusing consumer Step:
5/ParMultiDo(CounterOperation) into Step: 4/ParMultiDo(CounterOperation)
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.364Z: Fusing consumer Step:
6/ParMultiDo(CounterOperation) into Step: 5/ParMultiDo(CounterOperation)
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.395Z: Fusing consumer Step:
7/ParMultiDo(CounterOperation) into Step: 6/ParMultiDo(CounterOperation)
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.420Z: Fusing consumer Step:
8/ParMultiDo(CounterOperation) into Step: 7/ParMultiDo(CounterOperation)
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.461Z: Fusing consumer Step:
9/ParMultiDo(CounterOperation) into Step: 8/ParMultiDo(CounterOperation)
Dec 07, 2022 12:38:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.491Z: Fusing consumer
ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step:
9/ParMultiDo(CounterOperation)
Dec 07, 2022 12:38:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:44.617Z: Running job using Streaming Engine
Dec 07, 2022 12:38:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:38:45.957Z: Starting 5 workers in us-central1-b...
Dec 07, 2022 12:39:02 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:39:00.583Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 07, 2022 12:39:42 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:39:40.516Z: Autoscaling: Raised the number of workers to 5
so that the pipeline can catch up with its backlog and keep up with its input rate.
> Task :sdks:java:testing:load-tests:run FAILED
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=af08b6e7-2f19-49e0-8993-bf1eb0046cc8,
currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 4194018
log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-4194018.out.log
----- Last 20 lines from daemon log file - daemon-4194018.out.log -----
INFO: 2022-12-07T12:38:45.957Z: Starting 5 workers in us-central1-b...
Dec 07, 2022 12:39:02 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:39:00.583Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 07, 2022 12:39:42 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-07T12:39:40.516Z: Autoscaling: Raised the number of workers to 5
so that the pipeline can catch up with its backlog and keep up with its input rate.
Daemon vm is shutting down... The daemon has exited normally or was terminated
in response to a user interrupt.
Remove shutdown hook failed
java.lang.IllegalStateException: Shutdown in progress
at
java.lang.ApplicationShutdownHooks.remove(ApplicationShutdownHooks.java:82)
at java.lang.Runtime.removeShutdownHook(Runtime.java:231)
at
org.gradle.process.internal.shutdown.ShutdownHooks.removeShutdownHook(ShutdownHooks.java:38)
at
org.gradle.process.internal.DefaultExecHandle.setEndStateInfo(DefaultExecHandle.java:208)
at
org.gradle.process.internal.DefaultExecHandle.failed(DefaultExecHandle.java:370)
at
org.gradle.process.internal.ExecHandleRunner.run(ExecHandleRunner.java:87)
at
org.gradle.internal.operations.CurrentBuildOperationPreservingRunnable.run(CurrentBuildOperationPreservingRunnable.java:42)
at
org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
at
org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
----- End of the daemon log -----
FAILURE: Build failed with an exception.
* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may
have crashed)
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]