See 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Batch_Performance_Test_Java_Avro/1813/display/redirect?page=changes>

Changes:

[chuck.yang] Pass str rather than TableReference

[Brian Hulette] bump worker to 20210301

[Andrew Pilloud] [BEAM-9379] Update vendored Calcite to 1.26.0

[Andrew Pilloud] [BEAM-9379] Fix linkage issues

[shehzaad] upgrade errorprone version to 2.3.2

[shehzaad] upgrade to 2.3.4 due to

[shehzaad] suppress new (post 2.3.1) errorprone patterns

[tysonjh] Update dataflow client.

[tysonjh] [BEAM-11932] Add Dataflow service options.

[Chamikara Madhusanka Jayalath] Updates Dataflow client

[noreply] Use errorprone_version instead of hardcoding.

[tysonjh] [BEAM-11932] Add Dataflow ServiceOptions.


------------------------------------------
[...truncated 328.06 KB...]
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution **** for 
':' Thread 10,5,main]) started.
Gradle Test Executor 13 started executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest
Watching 1348 directories to track changes
Custom actions are attached to task 
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' 
is 18bc08f81b5f1def35011f40bdb21e21
Task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is not up-to-date 
because:
  Task.upToDateWhen is false.
Watching 1348 directories to track changes
Starting process 'Gradle Test Executor 13'. Working directory: 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Batch_Performance_Test_Java_Avro/ws/src/sdks/java/io/bigquery-io-perf-tests>
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--writeMethod=FILE_LOADS","--writeFormat=AVRO","--testBigQueryDataset=beam_performance","--testBigQueryTable=bqio_write_10GB_java_avro_0309002542","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=bqio_10GB_results_java_batch_avro","--influxMeasurement=bqio_10GB_results_java_batch_avro","--sourceOptions={\"numRecords\":\"10485760\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"1024\"}","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Batch_Performance_Test_Java_Avro/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.29.0-SNAPSHOT.jar","--region=us-central1";]>
 
-Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/6.8/****Main/gradle-****.jar 
****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 13'
Successfully started process 'Gradle Test Executor 13'
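
For reference: the -DbeamTestPipelineOptions value above is the JSON array of flags that the test JVM parses into Beam pipeline options. A minimal sketch of how such flags are typically turned into typed Dataflow options via PipelineOptionsFactory follows; the flag values are illustrative (taken from the list above), and the class name OptionsSketch is hypothetical, not part of this build.

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class OptionsSketch {
      public static void main(String[] args) {
        // Illustrative flags; the perf test itself receives them via -DbeamTestPipelineOptions.
        String[] sampleArgs = {
            "--project=apache-beam-testing",
            "--tempLocation=gs://temp-storage-for-perf-tests/loadtests",
            "--runner=DataflowRunner",
            "--region=us-central1",
            "--numWorkers=5",
            "--maxNumWorkers=5",
            "--autoscalingAlgorithm=NONE"
        };

        // Parse and validate the flags into a strongly typed options view.
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(sampleArgs)
                .withValidation()
                .as(DataflowPipelineOptions.class);

        System.out.println("Runner: " + options.getRunner().getSimpleName());
        System.out.println("Region: " + options.getRegion());
      }
    }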

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:<https://ci-beam.apache.org/job/beam_BiqQueryIO_Batch_Performance_Test_Java_Avro/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.29.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead 
STANDARD_ERROR
    Mar 09, 2021 12:44:37 AM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Mar 09, 2021 12:44:38 AM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 206 files. Enable logging at DEBUG level to see 
which files will be staged.
    Mar 09, 2021 12:44:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Mar 09, 2021 12:44:42 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 207 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Mar 09, 2021 12:44:42 AM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.29.0-SNAPSHOT-KSU6Kw39QqtaisVNYcVPQuSS5YzijYv_tH4d0dd8Amc.jar
    Mar 09, 2021 12:44:42 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading /tmp/test7055315695209109166.zip to 
gs://temp-storage-for-perf-tests/loadtests/staging/test-N9H__BJp_wVM99ajxKCjU6kwnK7-FCnNjwHnbSb-Yw8.jar
    Mar 09, 2021 12:44:43 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 206 files cached, 1 files newly uploaded in 0 
seconds
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from source as step s1
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Gather time as step s2
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records as step s3
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/PrepareWrite/ParDo(Anonymous) as step s4
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/JobIdCreationRoot_LOAD/Read(CreateSource) as step s5
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/CreateJobId_LOAD as step s6
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/JobIdSideInput_LOAD/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
 as step s7
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/JobIdSideInput_LOAD/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey
 as step s8
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/JobIdSideInput_LOAD/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
 as step s9
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/JobIdSideInput_LOAD/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
 as step s10
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/JobIdSideInput_LOAD/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 as step s11
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/JobIdSideInput_LOAD/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly
 as step s12
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/JobIdSideInput_LOAD/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
 as step s13
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/JobIdSideInput_LOAD/Combine.GloballyAsSingletonView/CreateDataflowView
 as step s14
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/Create.Values/Read(CreateSource) as 
step s15
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/GetTempFilePrefix as step s16
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
 as step s17
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey
 as step s18
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
 as step s19
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
 as step s20
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 as step s21
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly
 as step s22
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
 as step s23
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/TempFilePrefixView/Combine.GloballyAsSingletonView/CreateDataflowView
 as step s24
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/rewindowIntoGlobal/Window.Assign as 
step s25
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/WriteBundlesToFiles as step s26
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/GroupByDestination as step s27
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/StripShardId/Map as step s28
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/WriteGroupedRecords as step s29
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/FlattenFiles as step s30
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/ReifyResults/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow) as 
step s31
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/ReifyResults/View.AsIterable/CreateDataflowView as step s32
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/ReifyResults/Create.Values/Read(CreateSource) as step s33
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/ReifyResults/ParDo(Anonymous) as step 
s34
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/WritePartitionUntriggered as step s35
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/MultiPartitionsReshuffle/Window.Into()/Window.Assign as step s36
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/MultiPartitionsReshuffle/GroupByKey as 
step s37
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/MultiPartitionsReshuffle/ExpandIterable 
as step s38
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/MultiPartitionsWriteTables/ParMultiDo(WriteTables) as step s39
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/MultiPartitionsWriteTables/WithKeys/AddKeys/Map as step s40
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/MultiPartitionsWriteTables/Window.Into()/Window.Assign as step s41
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/MultiPartitionsWriteTables/GroupByKey 
as step s42
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/MultiPartitionsWriteTables/Values/Values/Map as step s43
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/MultiPartitionsWriteTables/ParDo(GarbageCollectTemporaryFiles) as 
step s44
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/ReifyRenameInput/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)
 as step s45
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/ReifyRenameInput/View.AsIterable/CreateDataflowView as step s46
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/ReifyRenameInput/Create.Values/Read(CreateSource) as step s47
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/ReifyRenameInput/ParDo(Anonymous) as 
step s48
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/WriteRenameUntriggered as step s49
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/SinglePartitionsReshuffle/Window.Into()/Window.Assign as step s50
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/SinglePartitionsReshuffle/GroupByKey as 
step s51
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/SinglePartitionsReshuffle/ExpandIterable as step s52
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/SinglePartitionWriteTables/ParMultiDo(WriteTables) as step s53
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/SinglePartitionWriteTables/WithKeys/AddKeys/Map as step s54
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/SinglePartitionWriteTables/Window.Into()/Window.Assign as step s55
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/BatchLoads/SinglePartitionWriteTables/GroupByKey 
as step s56
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/SinglePartitionWriteTables/Values/Values/Map as step s57
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/SinglePartitionWriteTables/ParDo(GarbageCollectTemporaryFiles) as 
step s58
    Mar 09, 2021 12:44:43 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/BatchLoads/CreateEmptyFailedInserts/Read(CreateSource) as step s59
    Mar 09, 2021 12:44:43 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to 
gs://temp-storage-for-perf-tests/loadtests/staging/
    Mar 09, 2021 12:44:44 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <267468 bytes, hash 
e281b81267de4d7bfc45bc69c1336329fa4c9847f88fdc0a2ebcbdbf1e711b7a> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-4oG4EmfeTXv8RbxpwTNjKfpMmEf4j9wKLry9vx5xG3o.pb
    Mar 09, 2021 12:44:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.29.0-SNAPSHOT
    Mar 09, 2021 12:44:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-03-08_16_44_44-13384112055063638297?project=apache-beam-testing
    Mar 09, 2021 12:44:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-03-08_16_44_44-13384112055063638297
    Mar 09, 2021 12:44:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2021-03-08_16_44_44-13384112055063638297
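
    A sketch, not from this run: the same cancellation can also be issued from code through the PipelineResult handle returned by pipeline.run(), the programmatic counterpart of the gcloud command above. The class and method names here are hypothetical, and 'pipeline' is assumed to be a fully constructed Beam pipeline.

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    public class CancelSketch {
      // Submit the pipeline and then ask the runner to cancel the remote job.
      static void submitAndCancel(Pipeline pipeline) throws IOException {
        PipelineResult result = pipeline.run(); // submit to the configured runner
        result.cancel();                        // request cancellation of the remote job
      }
    }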

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead SKIPPED
Watching 1350 directories to track changes
Watching 1352 directories to track changes
Watching 1353 directories to track changes

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution **** for 
':' Thread 10,5,main]) completed. Took 14.451 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 13' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at 
https://docs.gradle.org/6.8/userguide/java_testing.html#sec:test_execution
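  (For reference: exit value 143 typically corresponds to 128 + 15, i.e. SIGTERM, meaning
  the test JVM was terminated externally, for example by a build timeout or abort, rather
  than exiting through a test failure of its own.)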

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 14m 29s
106 actionable tasks: 105 executed, 1 from cache
Watching 1353 directories to track changes

Publishing build scan...
https://gradle.com/s/g4cojqnkeyvvw

Unexpected exception thrown.
org.gradle.internal.remote.internal.MessageIOException: Could not write 
'/127.0.0.1:47208'.
        at 
org.gradle.internal.remote.internal.inet.SocketConnection.flush(SocketConnection.java:140)
        at 
org.gradle.internal.remote.internal.hub.MessageHub$ConnectionDispatch.run(MessageHub.java:333)
        at 
org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:64)
        at 
org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:48)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:56)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Broken pipe
        at sun.nio.ch.FileDispatcherImpl.write0(Native Method)
        at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47)
        at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
        at sun.nio.ch.IOUtil.write(IOUtil.java:51)
        at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:470)
        at 
org.gradle.internal.remote.internal.inet.SocketConnection$SocketOutputStream.writeWithNonBlockingRetry(SocketConnection.java:279)
        at 
org.gradle.internal.remote.internal.inet.SocketConnection$SocketOutputStream.writeBufferToChannel(SocketConnection.java:267)
        at 
org.gradle.internal.remote.internal.inet.SocketConnection$SocketOutputStream.flush(SocketConnection.java:261)
        at 
org.gradle.internal.remote.internal.inet.SocketConnection.flush(SocketConnection.java:138)
        ... 7 more

FAILURE: Build failed with an exception.

* What went wrong:
Not all **** daemon(s) could be stopped.
> Process 'Gradle Worker Daemon 6' finished with non-zero exit value 143
> Process 'Gradle Worker Daemon 8' finished with non-zero exit value 143
> Process 'Gradle Worker Daemon 9' finished with non-zero exit value 143
> Process 'Gradle Worker Daemon 10' finished with non-zero exit value 143
> Process 'Gradle Worker Daemon 11' finished with non-zero exit value 143
> Process 'Gradle Worker Daemon 12' finished with non-zero exit value 143

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 14m 30s
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
