See 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/2188/display/redirect>

Changes:


------------------------------------------
[...truncated 271.05 KB...]
Watching 1347 directories to track changes
Watching 1358 directories to track changes
Watching 1359 directories to track changes
Watching 1360 directories to track changes
Loaded cache entry for task 
':runners:google-cloud-dataflow-java:compileTestJava' with cache key 
ab988b3d50b810ddcc7dd2984d135671
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** for 
':' Thread 11,5,main]) completed. Took 0.299 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' 
Thread 11,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no 
actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' 
Thread 11,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for 
':',5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Watching 1360 directories to track changes
Watching 1360 directories to track changes
Watching 1361 directories to track changes
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
Watching 1361 directories to track changes
file or directory 
'<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/resources/test>', 
not found
Watching 1361 directories to track changes
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for 
':',5,main]) completed. Took 0.041 secs.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar FROM-CACHE
Watching 1337 directories to track changes
Watching 1347 directories to track changes
Watching 1347 directories to track changes
Watching 1348 directories to track changes
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar'.
Build cache key for task 
':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is 
0cef5f37d1198503d7bf59180796c505
Task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is not 
up-to-date because:
  No history is available.
Watching 1348 directories to track changes
Watching 1362 directories to track changes
Loaded cache entry for task 
':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' with cache key 
0cef5f37d1198503d7bf59180796c505
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar 
(Thread[Execution **** for ':' Thread 5,5,main]) completed. Took 0.688 secs.
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution **** for 
':' Thread 5,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest
Watching 1362 directories to track changes
Watching 1362 directories to track changes
Watching 1362 directories to track changes
Watching 1362 directories to track changes
Watching 1362 directories to track changes
Custom actions are attached to task 
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' 
is a03cc35ad170f60be27ec167cc868fef
Task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is not up-to-date 
because:
  Task.upToDateWhen is false.
Watching 1362 directories to track changes
Watching 1362 directories to track changes
Starting process 'Gradle Test Executor 1'. Working directory: 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests>
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--writeMethod=STREAMING_INSERTS","--writeFormat=JSON","--testBigQueryDataset=beam_performance","--testBigQueryTable=bqio_write_10GB_java_stream_0403050454","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=bqio_10GB_results_java_stream","--influxMeasurement=bqio_10GB_results_java_stream","--sourceOptions={\"numRecords\":\"10485760\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"1024\"}","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.30.0-SNAPSHOT.jar>","--region=us-central1"]
 
-Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/6.8/****Main/gradle-****.jar 
****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.30.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class>]
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead 
STANDARD_ERROR
    Apr 03, 2021 6:33:45 AM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Apr 03, 2021 6:33:46 AM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 207 files. Enable logging at DEBUG level to see 
which files will be staged.
    Apr 03, 2021 6:33:47 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Apr 03, 2021 6:33:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
    Apr 03, 2021 6:33:50 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <136587 bytes, hash 
f4db95652a96034e2dca45092f92a24b4fdc4d0bf79ba10cb90909883d672f8e> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-9NuVZSqWA04tykUJL5KiS0_cTQv3m6EMuQkJiD1nL44.pb
    Apr 03, 2021 6:33:51 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 208 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Apr 03, 2021 6:33:51 AM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.30.0-SNAPSHOT-r1Q-U6K1C7rXgbO5tW1dJecV9hejJnRIdqvIGLE-UrI.jar
    Apr 03, 2021 6:33:51 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading /tmp/test6848334070971704050.zip to 
gs://temp-storage-for-perf-tests/loadtests/staging/test-0iPRsdVAxTQJtlXdpfC4mXwGvDu167fz-Y7pGI3xB-g.jar
    Apr 03, 2021 6:33:53 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 207 files cached, 1 files newly uploaded in 2 
seconds
    Apr 03, 2021 6:33:53 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from source as step s1
    Apr 03, 2021 6:33:53 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Gather time as step s2
    Apr 03, 2021 6:33:53 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records as step s3
    Apr 03, 2021 6:33:53 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/PrepareWrite/ParDo(Anonymous) as step s4
    Apr 03, 2021 6:33:53 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables) 
as step s5
    Apr 03, 2021 6:33:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites as step s6
    Apr 03, 2021 6:33:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds as step s7
    Apr 03, 2021 6:33:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign 
as step s8
    Apr 03, 2021 6:33:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey as step s9
    Apr 03, 2021 6:33:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable as step s10
    Apr 03, 2021 6:33:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign as step s11
    Apr 03, 2021 6:33:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/StripShardId/Map as step s12
    Apr 03, 2021 6:33:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite/BatchedStreamingWrite.ViaBundleFinalization/ParMultiDo(BatchAndInsertElements)
 as step s13
    Apr 03, 2021 6:33:54 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.30.0-SNAPSHOT
    Apr 03, 2021 6:33:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-04-02_23_33_54-15790852338363470994?project=apache-beam-testing
    Apr 03, 2021 6:33:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-04-02_23_33_54-15790852338363470994
    Apr 03, 2021 6:33:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2021-04-02_23_33_54-15790852338363470994
    Apr 03, 2021 6:33:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-04-03T06:33:57.584Z: The requested max number of ****s (5) is 
ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:02.252Z: Worker configuration: n1-standard-1 in 
us-central1-b.
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:02.868Z: Expanding CoGroupByKey operations into 
optimizable parts.
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:02.955Z: Expanding GroupByKey operations into 
optimizable parts.
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:02.991Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.114Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.158Z: Fusing consumer Gather time into Read from 
source
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.186Z: Fusing consumer Map records into Gather time
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.243Z: Fusing consumer Write to 
BQ/PrepareWrite/ParDo(Anonymous) into Map records
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.299Z: Fusing consumer Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables) into Write to 
BQ/PrepareWrite/ParDo(Anonymous)
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.356Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites into Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.401Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds into Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.446Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign 
into Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.476Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify into Write 
to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.500Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write into Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.523Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow 
into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.558Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable into Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.585Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign into Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.621Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/StripShardId/Map into Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:03.655Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite/BatchedStreamingWrite.ViaBundleFinalization/ParMultiDo(BatchAndInsertElements)
 into Write to BQ/StreamingInserts/StreamingWriteTables/StripShardId/Map
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:04.052Z: Executing operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:04.146Z: Starting 5 ****s in us-central1-b...
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:04.471Z: Finished operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Apr 03, 2021 6:34:04 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:04.624Z: Executing operation Read from source+Gather 
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Apr 03, 2021 6:34:21 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:21.450Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Apr 03, 2021 6:34:46 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:45.634Z: Autoscaling: Raised the number of ****s to 
1 based on the rate of progress in the currently running stage(s).
    Apr 03, 2021 6:34:46 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:45.667Z: Resized **** pool to 1, though goal was 5.  
This could be a quota issue.
    Apr 03, 2021 6:34:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:34:56.014Z: Autoscaling: Raised the number of ****s to 
5 based on the rate of progress in the currently running stage(s).
    Apr 03, 2021 6:35:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:35:25.687Z: Workers have started successfully.
    Apr 03, 2021 6:35:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-04-03T06:35:25.712Z: Workers have started successfully.

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead SKIPPED
Watching 1364 directories to track changes
Watching 1366 directories to track changes
Watching 1367 directories to track changes

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution **** for 
':' Thread 5,5,main]) completed. Took 1 mins 47.244 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 1' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at 
https://docs.gradle.org/6.8/userguide/java_testing.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 5s
106 actionable tasks: 64 executed, 42 from cache
Watching 1367 directories to track changes

Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=33a15b2e-9c36-480d-9d0a-aed7c1d40abd, 
currentDir=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 16629
  log file: /home/jenkins/.gradle/daemon/6.8/daemon-16629.out.log
----- Last  20 lines from daemon log file - daemon-16629.out.log -----
Execution failed for task 
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 1' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at 
https://docs.gradle.org/6.8/userguide/java_testing.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 5s
106 actionable tasks: 64 executed, 42 from cache
Watching 1367 directories to track changes

Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated 
in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may 
have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]