See <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/1191/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Refactor Beam SQL QueryPlanner instantiation to be more type safe and

[Kenneth Knowles] Move individual spotbug suppressions out of spotbugs-filter.xml and

[Kenneth Knowles] [BEAM-10575] Fix rawtypes in FileSystemRegistrar and implementations

[Kenneth Knowles] [BEAM-10556] Eliminate remaining trivial rawtypes in

[Kenneth Knowles] [BEAM-10577] Eliminate remaining trivial rawtypes in sdks/java/io/azure

[Kenneth Knowles] [BEAM-10575] Eliminate rawtypes from GCP IOs and enable -Wrawtypes


------------------------------------------
[...truncated 237.15 KB...]

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is ae1276dc88f66f6a1ae7c8439c323a00
Task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Custom actions are attached to task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
Starting process 'Gradle Test Executor 5'. Working directory: 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests>
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--writeMethod=STREAMING_INSERTS","--writeFormat=JSON","--testBigQueryDataset=beam_performance","--testBigQueryTable=bqio_write_10GB_java_stream_0730050219","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=bqio_10GB_results_java_stream","--influxMeasurement=bqio_10GB_results_java_stream","--sourceOptions={\"numRecords\":\"10485760\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"1024\"}","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.24.0-SNAPSHOT.jar","--region=us-central1";]>
 
-Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/5.2.1/****Main/gradle-****.jar 
****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 5'
Successfully started process 'Gradle Test Executor 5'
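
For reference, the test process above receives its configuration through the -DbeamTestPipelineOptions system property visible in the logged command line. A minimal sketch of reproducing this run by hand, assuming a checkout of the Beam workspace referenced above and that the Gradle invocation forwards this property to the test JVM (the exact plumbing is not visible in this log), might look like:

> ./gradlew :sdks:java:io:bigquery-io-perf-tests:integrationTest \
    --tests org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT \
    -DbeamTestPipelineOptions='["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--runner=DataflowRunner","--region=us-central1"]'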

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.24.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.25/bccda40ebc8067491b32a88f49615a747d20082d/slf4j-jdk14-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead 
STANDARD_ERROR
    Jul 30, 2020 6:34:02 AM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Jul 30, 2020 6:34:02 AM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 198 files. Enable logging at DEBUG level to see 
which files will be staged.
    Jul 30, 2020 6:34:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Jul 30, 2020 6:34:04 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 199 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading /tmp/test7384111643894429856.zip to 
gs://temp-storage-for-perf-tests/loadtests/staging/test-kOdKJe9gKISrEo9pKIdxIhjRVbPmFVPxywZtYI2z2A0.jar
    Jul 30, 2020 6:34:05 AM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.24.0-SNAPSHOT-9PpQK2E9f2pyDdVWQGI6y25D8QsLB2Lf8pb3VEp5yeM.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-uP89hXA2QDC3ILo71hHO0v2cJ02cYRIfluPROrgAoI4.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-GnMo-fr4qIBA9WnGu0_MDElGypyI7A9iveEDay3NEjQ.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-xEv_zHBxqlEyZ60uFaaxigdzNYMEafe8jTnh-1JlWKk.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-f_KKDp6VIcP53HPRyoj8VU3BnAdcQcVuo-fxpb4-MOc.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-04Hr0Sw-AXEtCpaB2qP5lPIB0gvV8XVIQy8l-H61b8w.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/synthetic/build/libs/beam-sdks-java-io-synthetic-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-synthetic-2.24.0-SNAPSHOT-4WT2HMBXdZpfLV_-vVdCJnfurxQBO1T2vP1v4lm4y3A.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-8yQjMNQ5bRiR5P1cxZa7Mbo3ab1zjfhPQNs9GIXVn0s.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-SWlAL0JDFS6WlXK0yOjfXfIqsBKzga2kaREw7USiyBM.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-tSAWk5Na9sj3jXYs_IrPS40y6bT69Gz7l-HxJ_xpCW8.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-r7gkKv2wMv64qUh_NQ-TjSTsYECtEaq_y5M8dPhGXPE.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-EB0G2V3BR6bud6HjSHGfhjvJQivoxYzGiydBww2dhzg.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-6KjA1H1qMgGbgEfsYAV_fH7v50aUqciRdOTsK4NzxvE.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-****-2.24.0-SNAPSHOT-9PpQK2E9f2pyDdVWQGI6y25D8QsLB2Lf8pb3VEp5yeM.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-XIIpeRWSXVuWlAAksktEa7reYecTCa7Qor8IJQHkT_M.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-LntPDhLQg3uFy3kkJxwQaN4Z9ttuFnb5Y_Fc79sNmJ8.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-4YjUWwpIrR2sb6eYa5FtmDTyrUdnQjP-DLVhujvh4_4.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-XQvHR4wCv3O2w3s0xHOlALPwM6c4cz9LOGjVFZR6AgQ.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-lxwvqYrdxe9AN6aYVMfIMKFptE9eItSXS8jzXS0qqFs.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-t-rbUC9cMo9Y6s7LbEXTs60cGzPCHuyQUcR3KEvTRjg.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-RzWOq3Lysl9SluZueOxIj6TMy7ynGSA7aVQC-Q6hcHs.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-rXBy8N4N57gMrQ2pfRHxB7_rMC954Y7f4SWJhn6s5h8.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-zAt5fDS34mzuDYw9j9fRHvHaZ1QDj4nGiL8G_MIrd7w.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-rdnkdTTsKpYKO-GgN0b3gB1VtM7VdZVrEt8E6skYdrI.jar
    Jul 30, 2020 6:34:05 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-gc1erH3ICPmDDK3ufGyXhRlLTMvXAQ7J03E_JpxpNhc.jar
    Jul 30, 2020 6:34:06 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 174 files cached, 25 files newly uploaded in 
1 seconds
    Jul 30, 2020 6:34:06 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from source as step s1
    Jul 30, 2020 6:34:06 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Gather time as step s2
    Jul 30, 2020 6:34:06 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records as step s3
    Jul 30, 2020 6:34:06 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/PrepareWrite/ParDo(Anonymous) as step s4
    Jul 30, 2020 6:34:06 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables) 
as step s5
    Jul 30, 2020 6:34:06 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites as step s6
    Jul 30, 2020 6:34:06 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds as step s7
    Jul 30, 2020 6:34:06 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign 
as step s8
    Jul 30, 2020 6:34:06 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey as step s9
    Jul 30, 2020 6:34:06 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable as step s10
    Jul 30, 2020 6:34:06 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign as step s11
    Jul 30, 2020 6:34:06 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite as step s12
    Jul 30, 2020 6:34:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to 
gs://temp-storage-for-perf-tests/loadtests/staging/
    Jul 30, 2020 6:34:06 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <104153 bytes, hash 
4205dc643b7bcba8d4b5c86d65ee1d306b4ddccd33cb4a9acf9f0e072c77f010> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-QgXcZDt7y6jUtchtZe4dMGtN3M0zy0qaz58OByx38BA.pb
    Jul 30, 2020 6:34:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Jul 30, 2020 6:34:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-29_23_34_06-10068652496037931319?project=apache-beam-testing
    Jul 30, 2020 6:34:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-07-29_23_34_06-10068652496037931319
    Jul 30, 2020 6:34:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-07-29_23_34_06-10068652496037931319
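    As a side note (not taken from this log), the state of the submitted job can also be inspected with the same 'gcloud' tool before deciding whether to cancel it, for example:
    > gcloud dataflow jobs describe 2020-07-29_23_34_06-10068652496037931319 --project=apache-beam-testing --region=us-central1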
    Jul 30, 2020 6:34:08 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-30T06:34:07.020Z: The requested max number of ****s (5) is 
ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:14.414Z: Worker configuration: n1-standard-1 in 
us-central1-a.
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.150Z: Expanding CoGroupByKey operations into 
optimizable parts.
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.226Z: Expanding GroupByKey operations into 
optimizable parts.
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.255Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.349Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.373Z: Fusing consumer Gather time into Read from 
source
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.411Z: Fusing consumer Map records into Gather time
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.448Z: Fusing consumer Write to 
BQ/PrepareWrite/ParDo(Anonymous) into Map records
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.476Z: Fusing consumer Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables) into Write to 
BQ/PrepareWrite/ParDo(Anonymous)
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.527Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites into Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.563Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds into Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.652Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign 
into Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.688Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify into Write 
to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.717Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write into Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.755Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow 
into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.787Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable into Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.834Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign into Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:15.865Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite into Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:16.228Z: Executing operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:16.309Z: Starting 5 ****s in us-central1-a...
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:16.342Z: Finished operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Jul 30, 2020 6:34:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:16.470Z: Executing operation Read from source+Gather 
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Jul 30, 2020 6:34:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-30T06:34:32.028Z: Your project already contains 100 
Dataflow-created metric descriptors and Stackdriver will not create new 
Dataflow custom metrics for this job. Each unique user-defined metric name 
(independent of the DoFn in which it is defined) produces a new metric 
descriptor. To delete old / unused metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jul 30, 2020 6:34:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:34:41.344Z: Autoscaling: Raised the number of ****s to 
5 based on the rate of progress in the currently running stage(s).
    Jul 30, 2020 6:35:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:35:02.317Z: Workers have started successfully.
    Jul 30, 2020 6:35:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:35:02.343Z: Workers have started successfully.
    Jul 30, 2020 6:37:35 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:37:34.300Z: Finished operation Read from source+Gather 
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Jul 30, 2020 6:37:35 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:37:34.361Z: Executing operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
    Jul 30, 2020 6:37:35 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:37:34.414Z: Finished operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
    Jul 30, 2020 6:37:35 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-30T06:37:34.476Z: Executing operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite
    Jul 30, 2020 6:44:23 AM 
org.apache.beam.runners.dataflow.DataflowPipelineJob lambda$waitUntilFinish$0
    WARNING: Job is already running in Google Cloud Platform, Ctrl-C will not cancel it.
    To cancel the job in the cloud, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-07-29_23_34_06-10068652496037931319

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead SKIPPED

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution **** for ':' Thread 10,5,main]) completed. Took 10 mins 25.965 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 5' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
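
For context: exit value 143 is 128 + 15 (SIGTERM), i.e. the test JVM was terminated externally rather than failing on its own, which is consistent with the daemon loss reported further below. A hypothetical first diagnostic step (not taken from this log) would be to re-run just this task with more Gradle output:

> ./gradlew :sdks:java:io:bigquery-io-perf-tests:integrationTest --info --stacktrace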

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
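
As a usage sketch (not from this log), the individual deprecation warnings can be surfaced by re-running any task with that flag, e.g.:

> ./gradlew :sdks:java:io:bigquery-io-perf-tests:integrationTest --warning-mode all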

BUILD FAILED in 11m 3s
84 actionable tasks: 55 executed, 29 from cache

Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=b95d63f6-9358-449f-8aa0-ac0063cd31b0, currentDir=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src}>
Attempting to read last messages from the daemon log...
Daemon pid: 7612
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-7612.out.log
----- Last  20 lines from daemon log file - daemon-7612.out.log -----
* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 5' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 3s
84 actionable tasks: 55 executed, 29 from cache

Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
