See
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/2211/display/redirect?page=changes>
Changes:
[Boyuan Zhang] Change PubSubSource and PubSubSink translation to avoid special
[Andrew Pilloud] Complex Type Passthrough Test
[Andrew Pilloud] Don't use base types in BeamCalcRel
[Andrew Pilloud] Use correct schema geters, enforce types
[Andrew Pilloud] Rename functions, add comments
[noreply] [BEAM-12112] Disable streaming mode for PORTABILITY_BATCH (#14452)
[noreply] [BEAM-9547] Implementations for a few more DataFrame operations (#14362)
------------------------------------------
[...truncated 269.61 KB...]
Watching 1308 directories to track changes
Watching 1309 directories to track changes
Loaded cache entry for task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' with cache key b97509f4f1ef18d76f9b1f8386c62bfd
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar (Thread[Execution **** for ':' Thread 5,5,main]) completed. Took 0.188 secs.
Watching 1316 directories to track changes
> Task :sdks:java:io:bigquery-io-perf-tests:compileTestJava FROM-CACHE
Watching 1307 directories to track changes
Watching 1316 directories to track changes
Custom actions are attached to task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava'.
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava' is 2d3adb9b029d21411a3b24a9deb0270d
Task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava' is not up-to-date because:
No history is available.
Watching 1316 directories to track changes
Watching 1323 directories to track changes
Watching 1325 directories to track changes
Watching 1327 directories to track changes
Loaded cache entry for task ':sdks:java:io:bigquery-io-perf-tests:compileTestJava' with cache key 2d3adb9b029d21411a3b24a9deb0270d
:sdks:java:io:bigquery-io-perf-tests:compileTestJava (Thread[Execution **** for ':' Thread 4,5,main]) completed. Took 0.227 secs.
:sdks:java:io:bigquery-io-perf-tests:testClasses (Thread[Execution **** for ':' Thread 4,5,main]) started.
> Task :sdks:java:io:bigquery-io-perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:io:bigquery-io-perf-tests:testClasses' as it has no actions.
:sdks:java:io:bigquery-io-perf-tests:testClasses (Thread[Execution **** for ':' Thread 4,5,main]) completed. Took 0.0 secs.
> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Watching 1306 directories to track changes
Watching 1337 directories to track changes
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is cd78a6a8529a9c4f683042be1a28bb8d
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
No history is available.
Watching 1337 directories to track changes
Watching 1348 directories to track changes
Watching 1349 directories to track changes
Watching 1350 directories to track changes
Loaded cache entry for task ':runners:google-cloud-dataflow-java:compileTestJava' with cache key cd78a6a8529a9c4f683042be1a28bb8d
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** for ':' Thread 8,5,main]) completed. Took 0.292 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' Thread 8,5,main]) started.
> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' Thread 8,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for ':' Thread 8,5,main]) started.
> Task :runners:google-cloud-dataflow-java:testJar
Watching 1350 directories to track changes
Watching 1351 directories to track changes
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
No history is available.
Watching 1351 directories to track changes
file or directory '/home/jenkins/jenkins-slave/workspace/beam_BiqQueryIO_Streaming_Performance_Test_Java/src/runners/google-cloud-dataflow-java/build/resources/test', not found
Watching 1351 directories to track changes
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for ':' Thread 8,5,main]) completed. Took 0.031 secs.
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution **** for ':' Thread 8,5,main]) started.
Gradle Test Executor 1 started executing tests.
> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest
Watching 1351 directories to track changes
Custom actions are attached to task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is 9399ea1528ab96bc62f2c23c58a03006
Task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is not up-to-date because:
Task.upToDateWhen is false.
Watching 1351 directories to track changes
Starting process 'Gradle Test Executor 1'. Working directory: /home/jenkins/jenkins-slave/workspace/beam_BiqQueryIO_Streaming_Performance_Test_Java/src/sdks/java/io/bigquery-io-perf-tests
Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java
-DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--writeMethod=STREAMING_INSERTS","--writeFormat=JSON","--testBigQueryDataset=beam_performance","--testBigQueryTable=bqio_write_10GB_java_stream_0409002212","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=bqio_10GB_results_java_stream","--influxMeasurement=bqio_10GB_results_java_stream","--sourceOptions={\"numRecords\":\"10485760\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"1024\"}","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--****HarnessContainerImage=","--dataflowWorkerJar=/home/jenkins/jenkins-slave/workspace/beam_BiqQueryIO_Streaming_Performance_Test_Java/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.30.0-SNAPSHOT.jar","--region=us-central1"]
-Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8/****Main/gradle-****.jar ****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT STANDARD_ERROR
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/jenkins/jenkins-slave/workspace/beam_BiqQueryIO_Streaming_Performance_Test_Java/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.30.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_ERROR
Apr 09, 2021 12:33:52 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Apr 09, 2021 12:33:53 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 207 files. Enable logging at DEBUG level to see which files will be staged.
Apr 09, 2021 12:33:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Apr 09, 2021 12:33:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Apr 09, 2021 12:33:57 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <136591 bytes, hash f9dca725062290c043ec9925181be80d048c4f21f194f824193c9446515b9620> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline--dynJQYikMBD7JklGBvoDQSMTyHxlPgkGTyURlFbliA.pb
Apr 09, 2021 12:33:58 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 208 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Apr 09, 2021 12:33:58 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-****.jar as beam-runners-google-cloud-dataflow-java-legacy-****-2.30.0-SNAPSHOT-D6xFkBVxg1HRue-hDzpDQZrAOoUBq-VdpMPQC4YgTOw.jar
Apr 09, 2021 12:33:58 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /tmp/test3315013610480949169.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-nv2V-ftt3CWF14YnU_u90tQO-Bk-zHCei6f0gSqZlkU.jar
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 207 files cached, 1 files newly uploaded in 0 seconds
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read from source as step s1
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Gather time as step s2
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Map records as step s3
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/PrepareWrite/ParDo(Anonymous) as step s4
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables) as step s5
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites as step s6
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds as step s7
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign as step s8
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey as step s9
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable as step s10
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign as step s11
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/StripShardId/Map as step s12
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite/BatchedStreamingWrite.ViaBundleFinalization/ParMultiDo(BatchAndInsertElements) as step s13
Apr 09, 2021 12:33:59 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.30.0-SNAPSHOT
Apr 09, 2021 12:34:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-08_17_33_59-10549752733372826918?project=apache-beam-testing
Apr 09, 2021 12:34:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-04-08_17_33_59-10549752733372826918
Apr 09, 2021 12:34:00 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-04-08_17_33_59-10549752733372826918
Apr 09, 2021 12:34:04 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-04-09T00:34:02.749Z: The requested max number of ****s (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
Apr 09, 2021 12:34:10 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:10.160Z: Worker configuration: n1-standard-1 in us-central1-f.
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:10.848Z: Expanding CoGroupByKey operations into optimizable parts.
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:10.935Z: Expanding GroupByKey operations into optimizable parts.
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:10.965Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.070Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.100Z: Fusing consumer Gather time into Read from source
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.131Z: Fusing consumer Map records into Gather time
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.154Z: Fusing consumer Write to BQ/PrepareWrite/ParDo(Anonymous) into Map records
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.178Z: Fusing consumer Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables) into Write to BQ/PrepareWrite/ParDo(Anonymous)
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.212Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites into Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.236Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds into Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.257Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign into Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.318Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.341Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.373Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.409Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.438Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.471Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/StripShardId/Map into Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.493Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite/BatchedStreamingWrite.ViaBundleFinalization/ParMultiDo(BatchAndInsertElements) into Write to BQ/StreamingInserts/StreamingWriteTables/StripShardId/Map
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.818Z: Executing operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:11.903Z: Starting 5 ****s in us-central1-f...
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:12.299Z: Finished operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
Apr 09, 2021 12:34:12 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:12.425Z: Executing operation Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
Apr 09, 2021 12:34:25 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:25.089Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Apr 09, 2021 12:34:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:34:57.847Z: Autoscaling: Raised the number of ****s to 5 based on the rate of progress in the currently running stage(s).
Apr 09, 2021 12:35:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:35:23.042Z: Workers have started successfully.
Apr 09, 2021 12:35:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:35:23.060Z: Workers have started successfully.
Apr 09, 2021 12:36:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:36:19.898Z: Finished operation Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
Apr 09, 2021 12:36:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:36:20.020Z: Executing operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
Apr 09, 2021 12:36:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:36:20.075Z: Finished operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
Apr 09, 2021 12:36:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:36:20.143Z: Executing operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/StripShardId/Map+Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite/BatchedStreamingWrite.ViaBundleFinalization/ParMultiDo(BatchAndInsertElements)
Apr 09, 2021 12:52:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:52:05.953Z: Autoscaling: Raised the number of ****s to 5 based on the rate of progress in the currently running stage(s).
Apr 09, 2021 12:55:05 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:55:04.487Z: Autoscaling: Raised the number of ****s to 5 based on the rate of progress in the currently running stage(s).
Apr 09, 2021 12:57:08 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-04-09T00:57:07.996Z: Autoscaling: Raised the number of ****s to 5 based on the rate of progress in the currently running stage(s).
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-11' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:216)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:286)
	at com.sun.proxy.$Proxy140.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1211)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1203)
	at hudson.Launcher$ProcStarter.join(Launcher.java:523)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:804)
	at hudson.model.Build$BuildExecution.build(Build.java:197)
	at hudson.model.Build$BuildExecution.doRun(Build.java:163)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:514)
	at hudson.model.Run.execute(Run.java:1907)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Caused by: hudson.remoting.Channel$OrderlyShutdown: Command Close created at
	at hudson.remoting.Channel$CloseCommand.execute(Channel.java:1322)
	at hudson.remoting.Channel$1.handle(Channel.java:607)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:85)
Caused by: Command Close created at
	at hudson.remoting.Command.<init>(Command.java:70)
	at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:1315)
	at hudson.remoting.Channel$CloseCommand.<init>(Channel.java:1313)
	at hudson.remoting.Channel.close(Channel.java:1488)
	at hudson.remoting.Channel.close(Channel.java:1455)
	at hudson.remoting.Channel$CloseCommand.execute(Channel.java:1321)
	... 2 more
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-11 is offline; cannot locate jdk_1.8_latest
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]