See 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/1181/display/redirect?page=changes>

Changes:

[Maximilian Michels] [BEAM-10306] Add latency measurements to Python Flink 
ParDo load test

[Maximilian Michels] [BEAM-10306] Add latency measurements to Python Flink 
GroupByKey load

[noreply] [BEAM-9702] Update Java KinesisIO to support AWS SDK v2 (#11318)

[noreply] [BEAM-10559] Add some comments and clean up SQL example. (#12355)


------------------------------------------
[...truncated 237.48 KB...]

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' 
is df39cfbf7d99adc839aa1d47a971afbc
Task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is not up-to-date 
because:
  Task.upToDateWhen is false.
Custom actions are attached to task 
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
Starting process 'Gradle Test Executor 2'. Working directory: 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests>
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--writeMethod=STREAMING_INSERTS","--writeFormat=JSON","--testBigQueryDataset=beam_performance","--testBigQueryTable=bqio_write_10GB_java_stream_0727174831","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=bqio_10GB_results_java_stream","--influxMeasurement=bqio_10GB_results_java_stream","--sourceOptions={\"numRecords\":\"10485760\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"1024\"}","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.24.0-SNAPSHOT.jar","--region=us-central1";]>
 
-Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/5.2.1/****Main/gradle-****.jar 
****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'
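
Note: the -DbeamTestPipelineOptions list in the command above is how this perf
test receives its configuration. As a minimal, hypothetical sketch (not the
actual BigQueryIOIT code), flags such as --writeMethod and --testBigQueryTable
are typically parsed with Beam's PipelineOptionsFactory along these lines:

    import org.apache.beam.sdk.options.Description;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ParseOptionsSketch {
      // Hypothetical subset of the flags shown in beamTestPipelineOptions above.
      public interface BigQueryPerfTestOptions extends PipelineOptions {
        @Description("Write method, e.g. STREAMING_INSERTS or FILE_LOADS")
        String getWriteMethod();
        void setWriteMethod(String value);

        @Description("Target BigQuery table for the write test")
        String getTestBigQueryTable();
        void setTestBigQueryTable(String value);
      }

      public static void main(String[] args) {
        // e.g. args = {"--writeMethod=STREAMING_INSERTS",
        //              "--testBigQueryTable=bqio_write_10GB_java_stream_0727174831"}
        BigQueryPerfTestOptions options = PipelineOptionsFactory.fromArgs(args)
            .withValidation()
            .as(BigQueryPerfTestOptions.class);
        System.out.println(options.getWriteMethod() + " -> " + options.getTestBigQueryTable());
      }
    }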

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.24.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.25/bccda40ebc8067491b32a88f49615a747d20082d/slf4j-jdk14-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
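
Note: the SLF4J warning above means two bindings (the one shaded into the legacy
worker jar and slf4j-jdk14 from the Gradle cache) are both on the test
classpath, and slf4j-jdk14 won. A small diagnostic sketch (assuming only that
slf4j-api and some binding are on the classpath) to print which binding is in
use:

    import org.slf4j.LoggerFactory;

    public class WhichSlf4jBinding {
      public static void main(String[] args) {
        // Prints the concrete logger factory class, e.g.
        // org.slf4j.impl.JDK14LoggerFactory as reported in the log above.
        System.out.println(LoggerFactory.getILoggerFactory().getClass().getName());
      }
    }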

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead 
STANDARD_ERROR
    Jul 27, 2020 6:34:09 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Jul 27, 2020 6:34:10 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 198 files. Enable logging at DEBUG level to see 
which files will be staged.
    Jul 27, 2020 6:34:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 199 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Jul 27, 2020 6:34:12 PM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.24.0-SNAPSHOT-Rf9VGu8XeeRILVQHrGU3nee4ZbO2AR95QsFfwKswxgk.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.24.0-SNAPSHOT-IogEJ-O4qx7KGg-VkY2rq5b5VCYkGRYOfXk_EHGeZKQ.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-tests-0V1WAGgvJuaN6Cvq8cZlvRyeIgk19DtFquVdqFoSC9w.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-jV0OEK2u5zHy5WC86zX5N-Qlccc_Wp1IYWIQEqM98Ns.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.24.0-SNAPSHOT-tests-nGZBkIDdm5malnU7XxGQZNon4QCw2yuvHcjjNBnmNhM.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-INdEdSkjZREVeer9vLkT8YncPMEK--bQYb9UmX-S6WU.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.24.0-SNAPSHOT-5Ui3KCJgRN3MRJZaUB4w24xDnndBJuYmTL0eB_H52WA.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-8CE4RWDxcPpFkCcE4JkNOJp845X4khAKsS8xrizEU9A.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-GI3ekXz7K_f1vkMbojJQnbcTWZjusr6LLcsPCN8it4k.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading /tmp/test5698517920374458095.zip to 
gs://temp-storage-for-perf-tests/loadtests/staging/test-kjiGQiozYZP_JijRyRhxfF5-R4kKEJB5ivnPpJlEVBk.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.24.0-SNAPSHOT-tests-nBPH-n7NuKKngHsBcE3XLSzA2m8YdSMCAk6H9atMyiY.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.24.0-SNAPSHOT-RsxGbav7fRHq8oXwsZEiqOfqURIWU8g7aZxX8X8xT24.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-NhWNj75QnlLgW_XIwSh0XUPNxyxL2E-saKblznNwL1E.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.24.0-SNAPSHOT-tests-9L07NDpJFl_wg6hjd_lhlH-aoyBYnk1QibD6tPau9t8.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.24.0-SNAPSHOT-15SfxXFC-dYVI-rJXM4noOVK8q8zDhvs6L6flv8JShk.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.24.0-SNAPSHOT-QEZ6-JP6hWBUcgZm9N2h1xIT06oXn-gTgeLMOSQGT1s.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.24.0-SNAPSHOT-kUtBG2StNDC3bD5upleeooG2ZP070zv2OFXq0RMJfwg.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-tests-vSfrhincYqOJVmFapU66di3UVfzU6JeXXG1du88ZEKQ.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/synthetic/build/libs/beam-sdks-java-io-synthetic-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-synthetic-2.24.0-SNAPSHOT-B3D7wI_sMjZQkCXHilj-qFMaL2yFdAx7Zhm63AkQsPM.jar
    Jul 27, 2020 6:34:12 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.24.0-SNAPSHOT-GX3don4emPRWP0dyHlJ4qanO0cVW3N6U-Tvy9PNvIcc.jar
    Jul 27, 2020 6:34:13 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.24.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.24.0-SNAPSHOT-tests-040VnZf7cqrRkjfsGhhzp4Gacdst-glxg4LUYjiT-kw.jar
    Jul 27, 2020 6:34:13 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.24.0-SNAPSHOT-r_B-t9TgemXMAC8ljHin9_7V56mf0QWhcI6aedM8vAE.jar
    Jul 27, 2020 6:34:13 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-****-2.24.0-SNAPSHOT-Rf9VGu8XeeRILVQHrGU3nee4ZbO2AR95QsFfwKswxgk.jar
    Jul 27, 2020 6:34:13 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-FNKTH-FMMbjJq763ODz_F41JaNCJwLKYW_-4vu1dVkg.jar
    Jul 27, 2020 6:34:13 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-yW5WTG99Xr2ZGSmpa8NG26ZgFNAQ8_5xXhiAB6nkZ4c.jar
    Jul 27, 2020 6:34:13 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-smh7x9nlNJS1PjnaDQ3fGbJCjahcRSElKcdkFZerxiA.jar
    Jul 27, 2020 6:34:13 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 174 files cached, 25 files newly uploaded in 
1 seconds
    Jul 27, 2020 6:34:14 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from source as step s1
    Jul 27, 2020 6:34:14 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Gather time as step s2
    Jul 27, 2020 6:34:14 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records as step s3
    Jul 27, 2020 6:34:14 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/PrepareWrite/ParDo(Anonymous) as step s4
    Jul 27, 2020 6:34:14 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables) 
as step s5
    Jul 27, 2020 6:34:14 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites as step s6
    Jul 27, 2020 6:34:14 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds as step s7
    Jul 27, 2020 6:34:14 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign 
as step s8
    Jul 27, 2020 6:34:14 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey as step s9
    Jul 27, 2020 6:34:14 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable as step s10
    Jul 27, 2020 6:34:14 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign as step s11
    Jul 27, 2020 6:34:14 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite as step s12
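
Note: the step names above (Read from source, Gather time, Map records, Write to
BQ/StreamingInserts/...) correspond to a streaming-inserts write pipeline of
roughly the following shape. This is a simplified, hypothetical sketch, not the
actual BigQueryIOIT source: GenerateSequence stands in for the synthetic source
configured via --sourceOptions, and the table name is a placeholder.

    import com.google.api.services.bigquery.model.TableFieldSchema;
    import com.google.api.services.bigquery.model.TableRow;
    import com.google.api.services.bigquery.model.TableSchema;
    import java.util.Collections;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.SimpleFunction;

    public class StreamingInsertsWriteSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Single STRING column; the real test writes ~1 KB synthetic records.
        TableSchema schema = new TableSchema().setFields(Collections.singletonList(
            new TableFieldSchema().setName("data").setType("STRING")));

        p.apply("Read from source", GenerateSequence.from(0).to(10_485_760L))
         .apply("Map records", MapElements.via(new SimpleFunction<Long, TableRow>() {
           @Override
           public TableRow apply(Long i) {
             return new TableRow().set("data", "record-" + i);
           }
         }))
         // STREAMING_INSERTS expands into the CreateTables, StreamingWriteTables,
         // Reshuffle and StreamingWrite steps listed in the log above.
         .apply("Write to BQ", BigQueryIO.writeTableRows()
             .to("apache-beam-testing:beam_performance.bqio_demo_table")  // placeholder
             .withSchema(schema)
             .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
             .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));

        p.run().waitUntilFinish();
      }
    }
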
    Jul 27, 2020 6:34:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to 
gs://temp-storage-for-perf-tests/loadtests/staging/
    Jul 27, 2020 6:34:14 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <104152 bytes, hash 
354e2ef564d0b882083917f0cc75917e7bd86d0513db7d412ae7750159816c46> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-NU4u9WTQuIIIORfwzHWRfnvYbQUT231BKud1AVmBbEY.pb
    Jul 27, 2020 6:34:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
    Jul 27, 2020 6:34:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-27_11_34_14-15670036024384513042?project=apache-beam-testing
    Jul 27, 2020 6:34:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-07-27_11_34_14-15670036024384513042
    Jul 27, 2020 6:34:15 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2020-07-27_11_34_14-15670036024384513042
    Jul 27, 2020 6:34:16 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-27T18:34:14.512Z: The requested max number of workers (5) is 
ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:23.896Z: Worker configuration: n1-standard-1 in 
us-central1-a.
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:24.648Z: Expanding CoGroupByKey operations into 
optimizable parts.
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:24.747Z: Expanding GroupByKey operations into 
optimizable parts.
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:24.787Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:24.921Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:24.959Z: Fusing consumer Gather time into Read from 
source
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:24.994Z: Fusing consumer Map records into Gather time
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.018Z: Fusing consumer Write to 
BQ/PrepareWrite/ParDo(Anonymous) into Map records
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.051Z: Fusing consumer Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables) into Write to 
BQ/PrepareWrite/ParDo(Anonymous)
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.077Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites into Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.110Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds into Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.134Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign 
into Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.176Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify into Write 
to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.223Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write into Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.255Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow 
into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.282Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable into Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.320Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign into Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable
    Jul 27, 2020 6:34:25 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.356Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite into Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign
    Jul 27, 2020 6:34:26 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.830Z: Executing operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Jul 27, 2020 6:34:26 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.917Z: Starting 5 workers in us-central1-a...
    Jul 27, 2020 6:34:26 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:25.977Z: Finished operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Jul 27, 2020 6:34:26 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:26.212Z: Executing operation Read from source+Gather 
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Jul 27, 2020 6:34:38 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-07-27T18:34:38.208Z: Your project already contains 100 
Dataflow-created metric descriptors and Stackdriver will not create new 
Dataflow custom metrics for this job. Each unique user-defined metric name 
(independent of the DoFn in which it is defined) produces a new metric 
descriptor. To delete old / unused metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jul 27, 2020 6:34:53 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:52.465Z: Autoscaling: Raised the number of workers to 
4 based on the rate of progress in the currently running stage(s).
    Jul 27, 2020 6:34:53 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:52.490Z: Resized worker pool to 4, though goal was 5.  
This could be a quota issue.
    Jul 27, 2020 6:34:59 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:34:57.897Z: Autoscaling: Raised the number of workers to 
5 based on the rate of progress in the currently running stage(s).
    Jul 27, 2020 6:35:18 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:35:15.973Z: Workers have started successfully.
    Jul 27, 2020 6:35:18 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:35:16.015Z: Workers have started successfully.
    Jul 27, 2020 6:37:52 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:37:49.759Z: Finished operation Read from source+Gather 
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Jul 27, 2020 6:37:52 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:37:49.853Z: Executing operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
    Jul 27, 2020 6:37:52 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:37:49.921Z: Finished operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
    Jul 27, 2020 6:37:52 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-07-27T18:37:50.008Z: Executing operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead SKIPPED

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Daemon worker Thread 
2,5,main]) completed. Took 4 mins 50.388 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 2' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at 
https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
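
Note on the failure above: exit value 143 conventionally indicates the test JVM
was terminated by SIGTERM (143 = 128 + 15), i.e. the Gradle test worker was
killed externally (for example by a build timeout or the daemon going away, as
the later daemon messages suggest) rather than failing an assertion. A trivial
Java illustration of that convention, for reference only:

    public class ExitValue143 {
      public static void main(String[] args) {
        // Convention for signal-terminated processes: exit value = 128 + signal number.
        int sigterm = 15;
        System.out.println("128 + SIGTERM(" + sigterm + ") = " + (128 + sigterm));  // 143
      }
    }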

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 36s
84 actionable tasks: 57 executed, 27 from cache

Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=6e47b318-49d5-4823-9e1b-172212aa949e, 
currentDir=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 23321
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-23321.out.log
----- Last  20 lines from daemon log file - daemon-23321.out.log -----
* What went wrong:
Execution failed for task 
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 2' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at 
https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 36s
84 actionable tasks: 57 executed, 27 from cache

Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated 
in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may 
have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
