See <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/1799/display/redirect>
Changes:
------------------------------------------
[...truncated 276.70 KB...]
Watching 1358 directories to track changes
Watching 1358 directories to track changes
Watching 1369 directories to track changes
Watching 1370 directories to track changes
Watching 1371 directories to track changes
Loaded cache entry for task ':runners:google-cloud-dataflow-java:compileTestJava' with cache key 82554694bc744fd2b664a944a893568d
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** for ':' Thread 10,5,main]) completed. Took 0.165 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' Thread 10,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for ':' Thread 10,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Watching 1371 directories to track changes
Could not read file path '<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/resources/test>'.
Watching 1371 directories to track changes
Watching 1372 directories to track changes
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
Watching 1372 directories to track changes
file or directory '<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/resources/test>', not found
Watching 1372 directories to track changes
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** for ':' Thread 10,5,main]) completed. Took 0.034 secs.
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution **** for ':' Thread 10,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest
Watching 1372 directories to track changes
Watching 1372 directories to track changes
Watching 1372 directories to track changes
Watching 1372 directories to track changes
Watching 1372 directories to track changes
Custom actions are attached to task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
Build cache key for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is 773355d993f02259dd57acf3f1d9c738
Task ':sdks:java:io:bigquery-io-perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Watching 1372 directories to track changes
Watching 1372 directories to track changes
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/io/bigquery-io-perf-tests>
Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--writeMethod=STREAMING_INSERTS","--writeFormat=JSON","--testBigQueryDataset=beam_performance","--testBigQueryTable=bqio_write_10GB_java_stream_1229050418","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=bqio_10GB_results_java_stream","--influxMeasurement=bqio_10GB_results_java_stream","--sourceOptions={\"numRecords\":\"10485760\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"1024\"}","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.28.0-SNAPSHOT.jar>","--region=us-central1"] -Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.7.1/****Main/gradle-****.jar ****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.28.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class>]
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead STANDARD_ERROR
    Dec 29, 2020 6:33:43 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Dec 29, 2020 6:33:43 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 210 files. Enable logging at DEBUG level to see which files will be staged.
    Dec 29, 2020 6:33:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Dec 29, 2020 6:33:46 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 211 files from PipelineOptions.filesToStage to staging location to prepare for execution.
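[Editor's annotation, not part of the CI log: the `--sourceOptions` JSON in the test invocation above specifies the synthetic source, and a quick back-of-envelope check confirms it matches the "10GB" in the test and table names. The script below is my own sketch; only the JSON values come from the log.]

```python
import json

# --sourceOptions value copied verbatim from the pipeline options above
source_options = json.loads(
    '{"numRecords":"10485760","keySizeBytes":"1","valueSizeBytes":"1024"}'
)

num_records = int(source_options["numRecords"])
record_size = int(source_options["keySizeBytes"]) + int(source_options["valueSizeBytes"])

# Values alone are exactly 10 GiB: 10485760 * 1024 == 10 * 1024**3;
# the 1-byte keys add roughly 10 MiB on top.
total_bytes = num_records * record_size
print(total_bytes)            # 10747904000
print(total_bytes / 1024**3)  # ~10.01 GiB
```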
    Dec 29, 2020 6:33:46 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-****.jar as beam-runners-google-cloud-dataflow-java-legacy-****-2.28.0-SNAPSHOT-SY51_ntnFm08YbRxJRsH0MUwRdL3yw8xnIg9nfCgmLo.jar
    Dec 29, 2020 6:33:46 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test9136733284915429722.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-Xlgc3H_lXDd3wY2BZFvggSUQmeiScudEzhjZNcmQYEQ.jar
    Dec 29, 2020 6:33:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.28.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.28.0-SNAPSHOT-unshaded-BJ-J9wOVNYzlw79Xvp4mOLVALrUcDIQbLv1cNRhyalM.jar
    Dec 29, 2020 6:33:47 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.28.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.28.0-SNAPSHOT-2e9AFfCDzF5k8gXJmq8sK7suG1o-XBJPRgfM--hJRac.jar
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 208 files cached, 3 files newly uploaded in 2 seconds
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from source as step s1
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Gather time as step s2
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records as step s3
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/PrepareWrite/ParDo(Anonymous) as step s4
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables) as step s5
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites as step s6
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds as step s7
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign as step s8
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey as step s9
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable as step s10
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign as step s11
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite as step s12
    Dec 29, 2020 6:33:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
    Dec 29, 2020 6:33:49 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <109328 bytes, hash b0efa82427eacb5d2d1effc14c271178aa6626daca8e20b1ef18ae0f16eb2331> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-sO-oJCfqy10tHv_BTCcReKpmJtrKjiCx7xiuDxbrIzE.pb
    Dec 29, 2020 6:33:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.28.0-SNAPSHOT
    Dec 29, 2020 6:33:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-28_22_33_49-6776631088243144278?project=apache-beam-testing
    Dec 29, 2020 6:33:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-12-28_22_33_49-6776631088243144278
    Dec 29, 2020 6:33:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-12-28_22_33_49-6776631088243144278
    Dec 29, 2020 6:33:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-12-29T06:33:49.639Z: The requested max number of ****s (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:58.201Z: Worker configuration: n1-standard-1 in us-central1-f.
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:58.715Z: Expanding CoGroupByKey operations into optimizable parts.
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:58.792Z: Expanding GroupByKey operations into optimizable parts.
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:58.833Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:58.919Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:58.944Z: Fusing consumer Gather time into Read from source
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:58.970Z: Fusing consumer Map records into Gather time
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:58.995Z: Fusing consumer Write to BQ/PrepareWrite/ParDo(Anonymous) into Map records
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:59.022Z: Fusing consumer Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables) into Write to BQ/PrepareWrite/ParDo(Anonymous)
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:59.056Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites into Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:59.085Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds into Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:59.109Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign into Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:59.139Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:59.164Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:59.200Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:59.241Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:59.269Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable
    Dec 29, 2020 6:33:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:59.305Z: Fusing consumer Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite into Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign
    Dec 29, 2020 6:34:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:59.875Z: Executing operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Dec 29, 2020 6:34:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:33:59.945Z: Starting 5 ****s in us-central1-f...
    Dec 29, 2020 6:34:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:34:00.027Z: Finished operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Dec 29, 2020 6:34:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:34:00.170Z: Executing operation Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Dec 29, 2020 6:34:21 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:34:19.109Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Dec 29, 2020 6:34:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:34:29.710Z: Autoscaling: Raised the number of ****s to 1 based on the rate of progress in the currently running stage(s).
    Dec 29, 2020 6:34:31 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:34:29.754Z: Resized **** pool to 1, though goal was 5. This could be a quota issue.
    Dec 29, 2020 6:34:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:34:39.956Z: Autoscaling: Raised the number of ****s to 4 based on the rate of progress in the currently running stage(s).
    Dec 29, 2020 6:34:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:34:39.989Z: Resized **** pool to 4, though goal was 5. This could be a quota issue.
    Dec 29, 2020 6:34:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:34:54.967Z: Workers have started successfully.
    Dec 29, 2020 6:34:56 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:34:54.994Z: Workers have started successfully.
    Dec 29, 2020 6:35:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:35:10.648Z: Autoscaling: Raised the number of ****s to 5 based on the rate of progress in the currently running stage(s).
    Dec 29, 2020 6:37:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:37:01.743Z: Finished operation Read from source+Gather time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Dec 29, 2020 6:37:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:37:01.808Z: Executing operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
    Dec 29, 2020 6:37:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:37:01.857Z: Finished operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
    Dec 29, 2020 6:37:02 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-12-29T06:37:01.934Z: Executing operation Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to BQ/StreamingInserts/StreamingWriteTables/StreamingWrite

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead SKIPPED

Watching 1374 directories to track changes
Watching 1376 directories to track changes
Watching 1377 directories to track changes

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution **** for ':' Thread 10,5,main]) completed. Took 5 mins 56.621 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 1' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at https://docs.gradle.org/6.7.1/userguide/java_testing.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 17s
109 actionable tasks: 66 executed, 43 from cache
Watching 1377 directories to track changes

Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=2c7e599c-2608-4431-85e5-99c78eee8547, currentDir=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 27901
  log file: /home/jenkins/.gradle/daemon/6.7.1/daemon-27901.out.log
----- Last 20 lines from daemon log file - daemon-27901.out.log -----
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 1' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at https://docs.gradle.org/6.7.1/userguide/java_testing.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 17s
109 actionable tasks: 66 executed, 43 from cache
Watching 1377 directories to track changes

Publishing build scan...
Daemon vm is shutting down...
The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----

FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
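[Editor's annotation, not part of the CI log: the "non-zero exit value 143" reported above follows the POSIX shell convention of encoding death-by-signal as 128 + signal number, so 143 corresponds to SIGTERM (signal 15), the signal typically delivered when a forked process is killed externally, e.g. by a build timeout. A minimal sketch of that arithmetic, assuming a Linux host like the Jenkins agent:]

```python
import signal

# Convention: exit code for a signal-killed process is 128 + signal number.
# signal.SIGTERM is 15 on Linux, so the expected code is 143.
SIGTERM_NUMBER = int(signal.SIGTERM)
print(128 + SIGTERM_NUMBER)  # 143, matching the exit value in the log
```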
