See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/415/display/redirect?page=changes>
Changes:

[vlad.matyunin] modifed WithKeys Playground Example
[alexander.zhuravlev] [Playground] Removed banner from Playground header, deleted unused
[shivam] Add example for `Distinct` PTransform
[manitgupta] Fix bug in StructUtils
[noreply] Add PyDoc buttons to the top and bottom of the Machine Learning page
[noreply] [Playground][Backend][Bug]: Moving the initialization of properties file
[noreply] Bump cloud.google.com/go/bigquery from 1.36.0 to 1.37.0 in /sdks
[noreply] Minor: Clean up an assertion in schemas_test (#22613)
[noreply] Exclude testWithShardedKeyInGlobalWindow on streaming runner v1 (#22593)
[noreply] Pub/Sub Schema Transform Read Provider (#22145)
[noreply] Update BigQuery URI validation to allow more valid URIs through (#22452)
[noreply] Add units tests for SpannerIO (#22428)
[noreply] Bump google.golang.org/api from 0.90.0 to 0.91.0 in /sdks (#22568)

------------------------------------------
[...truncated 47.60 KB...]

Finished license_scripts.sh

> Task :sdks:java:container:goPrepare UP-TO-DATE
> Task :sdks:java:container:java11:copyJavaThirdPartyLicenses
> Task :sdks:java:container:goBuild
/home/jenkins/go/bin/go1.18.1 build -o ./build/target/linux_amd64/boot boot.go boot_test.go

> Task :sdks:java:container:java11:copySdkHarnessLauncher
Execution optimizations have been disabled for task ':sdks:java:container:java11:copySdkHarnessLauncher' to ensure correctness due to the following reasons:
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'.
    Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:downloadCloudProfilerAgent' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed.
    Please refer to https://docs.gradle.org/7.4/userguide/validation_problems.html#implicit_dependency for more details about this problem.
  - Gradle detected a problem with the following location: '<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'.
    Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:pullLicenses' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed.
    Please refer to https://docs.gradle.org/7.4/userguide/validation_problems.html#implicit_dependency for more details about this problem.

> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker
> Task :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to use `gcloud` as a credential helper, then use `docker` as you would for non-GCR registries, e.g. `docker pull gcr.io/project-id/my-image`. Add `--verbosity=error` to silence this warning: `gcloud docker --verbosity=error -- pull gcr.io/project-id/my-image`.
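The deprecation warning above already names its own migration path; as a sketch, the flow it describes looks like this (the image tag below is taken from the push later in this log and is only illustrative):

```shell
# One-time setup: register gcloud as a Docker credential helper
# (adds credHelpers entries for *.gcr.io to ~/.docker/config.json)
gcloud auth configure-docker

# Afterwards, plain docker commands work against GCR registries, e.g.:
docker pull us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220809123730
```

Both commands assume an authenticated gcloud installation with access to the apache-beam-testing project.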
See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
02ab14e4030a: Preparing
aac51e31d6e3: Preparing
698c505a6dba: Preparing
76ba39ef7fc2: Preparing
13a097169a81: Preparing
4ab4c0ec365f: Preparing
0edc42324133: Preparing
f076c4c8f17f: Preparing
0f8c3f7e6d62: Preparing
5381b0c9bad4: Preparing
a5ce6013c38c: Preparing
55a40315e422: Preparing
b067da221878: Preparing
142709f3aafc: Preparing
95c578ec6905: Preparing
7b7f3078e1db: Preparing
826c3ddbb29c: Preparing
4ab4c0ec365f: Waiting
b626401ef603: Preparing
0edc42324133: Waiting
9b55156abf26: Preparing
f076c4c8f17f: Waiting
293d5db30c9f: Preparing
03127cdb479b: Preparing
9c742cd6c7a5: Preparing
55a40315e422: Waiting
5381b0c9bad4: Waiting
7b7f3078e1db: Waiting
142709f3aafc: Waiting
b067da221878: Waiting
b626401ef603: Waiting
03127cdb479b: Waiting
293d5db30c9f: Waiting
9b55156abf26: Waiting
aac51e31d6e3: Pushed
13a097169a81: Pushed
76ba39ef7fc2: Pushed
698c505a6dba: Pushed
02ab14e4030a: Pushed
0edc42324133: Pushed
f076c4c8f17f: Pushed
5381b0c9bad4: Pushed
0f8c3f7e6d62: Pushed
a5ce6013c38c: Pushed
4ab4c0ec365f: Pushed
55a40315e422: Pushed
7b7f3078e1db: Layer already exists
826c3ddbb29c: Layer already exists
b626401ef603: Layer already exists
9b55156abf26: Layer already exists
293d5db30c9f: Layer already exists
03127cdb479b: Layer already exists
9c742cd6c7a5: Layer already exists
142709f3aafc: Pushed
95c578ec6905: Pushed
b067da221878: Pushed
20220809123730: digest: sha256:edafb07a6af01797a11d1207d2aab35a7436e90bb1c3e32fa32a41a935f13a22 size: 4935

> Task :sdks:java:testing:load-tests:run
Aug 09, 2022 12:38:44 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Aug 09, 2022 12:38:44 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 222 files. Enable logging at DEBUG level to see which files will be staged.
Aug 09, 2022 12:38:45 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: ParDo(TimeMonitor)
Aug 09, 2022 12:38:45 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Aug 09, 2022 12:38:47 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 222 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Aug 09, 2022 12:38:48 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 222 files cached, 0 files newly uploaded in 0 seconds
Aug 09, 2022 12:38:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Aug 09, 2022 12:38:48 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <115399 bytes, hash 0b5fe1708ea3739e6e8d70f1da5d6aa4a144de78181209ff362fadf52ae61000> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-C1_hcI6jc55ujXDx2l1qpKFE3ngYEgn_Ni-t9SrmEAA.pb
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Aug 09, 2022 12:38:50 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000, endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000, endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000, endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000, endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000, endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000, endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000, endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000, endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000, endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000, endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000, endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000, endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000, endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000, endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000, endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000, endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000, endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000, endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000, endOffset=20000000}]
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s6
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s7
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s8
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s9
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s10
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s11
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s12
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s13
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s14
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s15
Aug 09, 2022 12:38:50 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.42.0-SNAPSHOT
Aug 09, 2022 12:38:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-08-09_05_38_50-5894889115984430578?project=apache-beam-testing
Aug 09, 2022 12:38:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-08-09_05_38_50-5894889115984430578
Aug 09, 2022 12:38:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-08-09_05_38_50-5894889115984430578
Aug 09, 2022 12:38:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-08-09T12:38:54.631Z: The workflow name is not a valid Cloud Label.
Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0pardo01-jenkins-08-ongu. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Aug 09, 2022 12:39:02 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:02.712Z: Worker configuration: e2-standard-2 in us-central1-a.
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.483Z: Expanding SplittableParDo operations into optimizable parts.
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.517Z: Expanding CollectionToSingleton operations into optimizable parts.
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.586Z: Expanding CoGroupByKey operations into optimizable parts.
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.617Z: Expanding SplittableProcessKeyed operations into optimizable parts.
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.649Z: Expanding GroupByKey operations into streaming Read/Write steps
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.688Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.756Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.795Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.827Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.849Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.882Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.915Z: Fusing consumer ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.942Z: Fusing consumer ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.966Z: Fusing consumer Step: 0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:03.992Z: Fusing consumer Step: 1/ParMultiDo(CounterOperation) into Step: 0/ParMultiDo(CounterOperation)
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:04.023Z: Fusing consumer Step: 2/ParMultiDo(CounterOperation) into Step: 1/ParMultiDo(CounterOperation)
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:04.057Z: Fusing consumer Step: 3/ParMultiDo(CounterOperation) into Step: 2/ParMultiDo(CounterOperation)
Aug 09, 2022 12:39:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:04.092Z: Fusing consumer Step: 4/ParMultiDo(CounterOperation) into Step: 3/ParMultiDo(CounterOperation)
Aug 09, 2022 12:39:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:04.124Z: Fusing consumer Step: 5/ParMultiDo(CounterOperation) into Step: 4/ParMultiDo(CounterOperation)
Aug 09, 2022 12:39:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:04.155Z: Fusing consumer Step: 6/ParMultiDo(CounterOperation) into Step: 5/ParMultiDo(CounterOperation)
Aug 09, 2022 12:39:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:04.180Z: Fusing consumer Step: 7/ParMultiDo(CounterOperation) into Step: 6/ParMultiDo(CounterOperation)
Aug 09, 2022 12:39:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:04.248Z: Fusing consumer Step: 8/ParMultiDo(CounterOperation) into Step: 7/ParMultiDo(CounterOperation)
Aug 09, 2022 12:39:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:04.280Z: Fusing consumer Step: 9/ParMultiDo(CounterOperation) into Step: 8/ParMultiDo(CounterOperation)
Aug 09, 2022 12:39:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:04.317Z: Fusing consumer ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step: 9/ParMultiDo(CounterOperation)
Aug 09, 2022 12:39:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:04.448Z: Running job using Streaming Engine
Aug 09, 2022 12:39:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:04.686Z: Starting 5 workers in us-central1-a...
Aug 09, 2022 12:39:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:19.401Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Aug 09, 2022 12:39:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:42.412Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.

> Task :sdks:java:testing:load-tests:run FAILED

The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=a642616a-d569-445e-b37e-4f98645f1b85, currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 934176
  log file: /home/jenkins/.gradle/daemon/7.4/daemon-934176.out.log
----- Last 20 lines from daemon log file - daemon-934176.out.log -----
Aug 09, 2022 12:39:20 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:19.401Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Aug 09, 2022 12:39:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-08-09T12:39:42.412Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Daemon vm is shutting down...
The daemon has exited normally or was terminated in response to a user interrupt.
Remove shutdown hook failed
java.lang.IllegalStateException: Shutdown in progress
	at java.lang.ApplicationShutdownHooks.remove(ApplicationShutdownHooks.java:82)
	at java.lang.Runtime.removeShutdownHook(Runtime.java:239)
	at org.gradle.process.internal.shutdown.ShutdownHooks.removeShutdownHook(ShutdownHooks.java:38)
	at org.gradle.process.internal.DefaultExecHandle.setEndStateInfo(DefaultExecHandle.java:208)
	at org.gradle.process.internal.DefaultExecHandle.aborted(DefaultExecHandle.java:365)
	at org.gradle.process.internal.ExecHandleRunner.completed(ExecHandleRunner.java:108)
	at org.gradle.process.internal.ExecHandleRunner.run(ExecHandleRunner.java:84)
	at org.gradle.internal.operations.CurrentBuildOperationPreservingRunnable.run(CurrentBuildOperationPreservingRunnable.java:42)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
----- End of the daemon log -----

FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
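For anyone reproducing this failure locally, Gradle's "Try:" suggestions above translate to the following sketch (assumes a Beam source checkout with configured GCP credentials; the task name is taken from this log):

```shell
# Rerun the failed load-test task with the extra diagnostics Gradle suggests,
# from the root of a Beam checkout:
./gradlew :sdks:java:testing:load-tests:run --info --stacktrace
```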
