See
<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Batch_Java17/356/display/redirect?page=changes>
Changes:
[chamikaramj] Fixing a breakage of multi-lang auto Runner v2 enabling
[noreply] [Playground] support for Kafka-enabled examples (#24459)
[noreply] Fix some small notebook typos (#24616)
[noreply] initialize and increment metrics properly (#24592)
[noreply] Add schema conversion support from Kafka Connect Record schemas to Beam
[noreply] interface{} -> any for starcgen (#24618)
[noreply] interface{} -> any for remaining references (#24625)
[noreply] Updating issue-tagger Workflow (#171) (#23143)
[noreply] [GitHub Actions] - Updates in Build Playground Backend to runs-on
[noreply] [GitHub Actions] - Updates in Build Playground Frontend to runs-on
[noreply] [GitHub Actions] - Updates in Go Tests to runs-on Self-hosted runners
[noreply] [GitHub Actions] - Updates in Java Tests to runs-on Self-hosted runners
[noreply] Updated label_prs workflow (#173) (#23145)
[noreply] [CdapIO] CdapIO and SparkReceiverIO updates (#24436)
[noreply] Revert "[GitHub Actions] - Updates in Java Tests to runs-on Self-hosted
[noreply] Disallow sliding windows with combiner fanout to prevent data loss
------------------------------------------
[...truncated 244.84 KB...]
> Task :sdks:java:testing:load-tests:compileJava UP-TO-DATE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:jar UP-TO-DATE
> Task :sdks:java:container:java17:copySdkHarnessLauncher
Execution optimizations have been disabled for task
':sdks:java:container:java17:copySdkHarnessLauncher' to ensure correctness due
to the following reasons:
- Gradle detected a problem with the following location:
'<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Batch_Java17/ws/src/sdks/java/container/build/target>'.
Reason: Task ':sdks:java:container:java17:copySdkHarnessLauncher' uses this
output of task ':sdks:java:container:downloadCloudProfilerAgent' without
declaring an explicit or implicit dependency. This can lead to incorrect
results being produced, depending on what order the tasks are executed. Please
refer to
https://docs.gradle.org/7.5.1/userguide/validation_problems.html#implicit_dependency
for more details about this problem.
- Gradle detected a problem with the following location:
'<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Batch_Java17/ws/src/sdks/java/container/build/target>'.
Reason: Task ':sdks:java:container:java17:copySdkHarnessLauncher' uses this
output of task ':sdks:java:container:pullLicenses' without declaring an
explicit or implicit dependency. This can lead to incorrect results being
produced, depending on what order the tasks are executed. Please refer to
https://docs.gradle.org/7.5.1/userguide/validation_problems.html#implicit_dependency
for more details about this problem.
> Task :sdks:java:container:java17:dockerPrepare
> Task :sdks:java:container:java17:docker
> Task :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above
18.03.
As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.
See:
https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
The push refers to repository
[us.gcr.io/apache-beam-testing/java-postcommit-it/java]
b7784b0a6d0b: Preparing
2cf7cc4d8f9f: Preparing
49b6df63767a: Preparing
0894cc0a0ed4: Preparing
fe8abfec7e40: Preparing
30002b8c172f: Preparing
8fce8b60e97c: Preparing
81787077a8df: Preparing
a7e6455318e4: Preparing
f38b3237dc51: Preparing
efaba40366a5: Preparing
d80ac9d885af: Preparing
d9b116a5bee1: Preparing
4ea9a3f95630: Preparing
a3c57f907c82: Preparing
3bc383470c05: Preparing
e93827457889: Preparing
08fa02ce37eb: Preparing
a037458de4e0: Preparing
bafdbe68e4ae: Preparing
a13c519c6361: Preparing
e93827457889: Waiting
08fa02ce37eb: Waiting
efaba40366a5: Waiting
a037458de4e0: Waiting
bafdbe68e4ae: Waiting
a13c519c6361: Waiting
d80ac9d885af: Waiting
8fce8b60e97c: Waiting
81787077a8df: Waiting
d9b116a5bee1: Waiting
4ea9a3f95630: Waiting
a7e6455318e4: Waiting
a3c57f907c82: Waiting
3bc383470c05: Waiting
30002b8c172f: Waiting
0894cc0a0ed4: Pushed
fe8abfec7e40: Pushed
2cf7cc4d8f9f: Pushed
49b6df63767a: Pushed
b7784b0a6d0b: Pushed
81787077a8df: Pushed
8fce8b60e97c: Pushed
f38b3237dc51: Pushed
a7e6455318e4: Pushed
efaba40366a5: Pushed
30002b8c172f: Pushed
3bc383470c05: Layer already exists
e93827457889: Layer already exists
08fa02ce37eb: Layer already exists
a037458de4e0: Layer already exists
bafdbe68e4ae: Layer already exists
a13c519c6361: Layer already exists
d9b116a5bee1: Pushed
d80ac9d885af: Pushed
a3c57f907c82: Pushed
4ea9a3f95630: Pushed
20221210150740: digest: sha256:e1ca4eb708daf10452473a733687376054c6e5c2c47299efc5fb3e7ba94cff1a size: 4728
> Task :sdks:java:testing:load-tests:run
Dec 10, 2022 3:08:05 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
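For reference, a minimal sketch (not taken from this job's sources) of how a staging location could be set explicitly so the runner does not fall back to gcpTempLocation; the project ID and gs:// paths below are placeholders, not the values used by this Jenkins run:

    // Sketch only: explicit temp/staging locations on DataflowPipelineOptions.
    // Project ID and bucket paths are placeholders, not this job's configuration.
    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ExplicitStagingLocation {
      public static void main(String[] args) {
        DataflowPipelineOptions options = PipelineOptionsFactory.fromArgs(args)
            .withValidation()
            .as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setProject("my-gcp-project");                  // placeholder project
        options.setRegion("us-central1");
        options.setTempLocation("gs://my-bucket/temp");        // default source for gcpTempLocation
        options.setStagingLocation("gs://my-bucket/staging");  // set explicitly; no fallback needed
        Pipeline pipeline = Pipeline.create(options);
        // ... add transforms here ...
        pipeline.run().waitUntilFinish();
      }
    }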
Dec 10, 2022 3:08:06 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from
the classpath: will stage 232 files. Enable logging at DEBUG level to see which
files will be staged.
Dec 10, 2022 3:08:06 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
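The warning above is emitted when the same anonymous Window.Into() transform is applied more than once. As an illustration only (the element type and window size are placeholders, not taken from this load test), each application can be given an explicit, unique step name via the two-argument apply:

    // Illustrative only: naming each Window.into(...) application so step names
    // are stable and unique. Element type and window size are placeholders.
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    class NamedWindowing {
      static PCollection<KV<byte[], byte[]>> window(
          String name, PCollection<KV<byte[], byte[]>> in) {
        // apply(name, transform) assigns a stable name instead of "Window.Into()2".
        return in.apply(name,
            Window.<KV<byte[], byte[]>>into(FixedWindows.of(Duration.standardMinutes(1))));
      }
      // e.g. window("Window input", input) and window("Window co-input", coInput)
    }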
Dec 10, 2022 3:08:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
Dec 10, 2022 3:08:09 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 232 files from PipelineOptions.filesToStage to staging location
to prepare for execution.
Dec 10, 2022 3:08:09 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 232 files cached, 0 files newly uploaded in 0
seconds
Dec 10, 2022 3:08:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 10, 2022 3:08:09 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <118237 bytes, hash
717c114a3f53ccd99355744bb5eb1753bac1ceed670d2a22d04b959f1529438e> to
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-cXwRSj9TzNmTVXRLtesXU7rBzu1nDSoi0EuVnxUpQ44.pb
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input as step s1
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s2
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s3
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input as step s4
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s5
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s6
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s7
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s8
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s9
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s10
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s11
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s12
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s13
Dec 10, 2022 3:08:11 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s14
Dec 10, 2022 3:08:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.45.0-SNAPSHOT
Dec 10, 2022 3:08:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-12-10_07_08_11-13453478924194995752?project=apache-beam-testing
Dec 10, 2022 3:08:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-12-10_07_08_11-13453478924194995752
Dec 10, 2022 3:08:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-12-10_07_08_11-13453478924194995752
Dec 10, 2022 3:08:14 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-12-10T15:08:13.885Z: The workflow name is not a valid Cloud
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring
will be labeled with this modified job name:
load0tests0java170dataflow0v20batch0cogbk04-jenkins-121015-razq. For the best
monitoring experience, please name your job with a valid Cloud Label. For
details, see:
https://cloud.google.com/compute/docs/labeling-resources#restrictions
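A job whose name already satisfies the Cloud label restrictions (lowercase letters, digits, and dashes) is not rewritten. A hypothetical sketch, assuming the harness allows the job name to be set directly; the name below is illustrative, not the one used by this run:

    // Hypothetical sketch: setting a job name that is already a valid Cloud label.
    // The name below is illustrative only.
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ValidJobName {
      public static void main(String[] args) {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        // Lowercase letters, digits, and dashes only, within the label length limit.
        options.setJobName("load-tests-java17-dataflow-v2-batch-cogbk-4");
        System.out.println("jobName = " + options.getJobName());
      }
    }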
Dec 10, 2022 3:08:21 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:20.739Z: Worker configuration: e2-standard-2 in
us-central1-b.
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:21.967Z: Expanding SplittableParDo operations into
optimizable parts.
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:21.997Z: Expanding CollectionToSingleton operations into
optimizable parts.
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.054Z: Expanding CoGroupByKey operations into
optimizable parts.
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.106Z: Expanding GroupByKey operations into
optimizable parts.
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.157Z: Fusing adjacent ParDo, Read, Write, and Flatten
operations
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.183Z: Unzipping flatten CoGroupByKey-Flatten for
input CoGroupByKey-MakeUnionTable0-ParMultiDo-ConstructUnionTable-.output
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.216Z: Fusing unzipped copy of CoGroupByKey/GBK/Write,
through flatten CoGroupByKey/Flatten, into producer
CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable)
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.254Z: Fusing consumer
CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult) into
CoGroupByKey/GBK/Read
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.286Z: Fusing consumer Ungroup and
reiterate/ParMultiDo(UngroupAndReiterate) into
CoGroupByKey/ConstructCoGbkResultFn/ParMultiDo(ConstructCoGbkResult)
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.392Z: Fusing consumer Collect total
bytes/ParMultiDo(ByteMonitor) into Ungroup and
reiterate/ParMultiDo(UngroupAndReiterate)
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.418Z: Fusing consumer Collect end time
metrics/ParMultiDo(TimeMonitor) into Collect total bytes/ParMultiDo(ByteMonitor)
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.452Z: Fusing consumer CoGroupByKey/GBK/Write into
CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable)
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.484Z: Fusing consumer Read
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read
input/Impulse
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.519Z: Fusing consumer
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction
into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.542Z: Fusing consumer
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/SplitWithSizing
into
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.564Z: Fusing consumer Collect start time metrics
(input)/ParMultiDo(TimeMonitor) into
Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.598Z: Fusing consumer Window.Into()/Window.Assign
into Collect start time metrics (input)/ParMultiDo(TimeMonitor)
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.631Z: Fusing consumer
CoGroupByKey/MakeUnionTable0/ParMultiDo(ConstructUnionTable) into
Window.Into()/Window.Assign
Dec 10, 2022 3:08:22 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.659Z: Fusing consumer Read
co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read
co-input/Impulse
Dec 10, 2022 3:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.695Z: Fusing consumer
Read-co-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction
into Read co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 10, 2022 3:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.729Z: Fusing consumer
Read-co-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/SplitWithSizing
into
Read-co-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction
Dec 10, 2022 3:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.754Z: Fusing consumer Collect start time metrics
(co-input)/ParMultiDo(TimeMonitor) into
Read-co-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 10, 2022 3:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.784Z: Fusing consumer Window.Into()2/Window.Assign
into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 10, 2022 3:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.813Z: Fusing consumer
CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into
Window.Into()2/Window.Assign
Dec 10, 2022 3:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:23.133Z: Executing operation Read co-input/Impulse+Read
co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+Read-co-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction+Read-co-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/SplitWithSizing
Dec 10, 2022 3:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:23.166Z: Executing operation Read input/Impulse+Read
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction+Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/SplitWithSizing
Dec 10, 2022 3:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:23.204Z: Starting 5 workers in us-central1-b...
Dec 10, 2022 3:08:38 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:36.835Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 10, 2022 3:09:08 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:09:06.858Z: Autoscaling: Raised the number of workers to 5
based on the rate of progress in the currently running stage(s).
Dec 10, 2022 3:10:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob
lambda$waitUntilFinish$0
WARNING: Job is already running in Google Cloud Platform, Ctrl-C will not
cancel it.
To cancel the job in the cloud, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-12-10_07_08_11-13453478924194995752
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=1d703840-37f8-40d0-89e9-9833d12bb796, currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_V2_Batch_Java17/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 2787253
log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-2787253.out.log
----- Last 20 lines from daemon log file - daemon-2787253.out.log -----
INFO: 2022-12-10T15:08:22.754Z: Fusing consumer Collect start time metrics
(co-input)/ParMultiDo(TimeMonitor) into
Read-co-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 10, 2022 3:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.784Z: Fusing consumer Window.Into()2/Window.Assign
into Collect start time metrics (co-input)/ParMultiDo(TimeMonitor)
Dec 10, 2022 3:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:22.813Z: Fusing consumer
CoGroupByKey/MakeUnionTable1/ParMultiDo(ConstructUnionTable) into
Window.Into()2/Window.Assign
Dec 10, 2022 3:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:23.133Z: Executing operation Read co-input/Impulse+Read
co-input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+Read-co-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction+Read-co-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/SplitWithSizing
Dec 10, 2022 3:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:23.166Z: Executing operation Read input/Impulse+Read
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/PairWithRestriction+Read-input-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/SplitWithSizing
Dec 10, 2022 3:08:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:23.204Z: Starting 5 workers in us-central1-b...
Dec 10, 2022 3:08:38 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:08:36.835Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 10, 2022 3:09:08 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-10T15:09:06.858Z: Autoscaling: Raised the number of workers to 5
based on the rate of progress in the currently running stage(s).
Dec 10, 2022 3:10:02 PM org.apache.beam.runners.dataflow.DataflowPipelineJob
lambda$waitUntilFinish$0
WARNING: Job is already running in Google Cloud Platform, Ctrl-C will not
cancel it.
To cancel the job in the cloud, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-12-10_07_08_11-13453478924194995752
Daemon vm is shutting down... The daemon has exited normally or was terminated
in response to a user interrupt.
----- End of the daemon log -----
FAILURE: Build failed with an exception.
* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may
have crashed)
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]