See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2761/display/redirect?page=changes>
Changes:

[stranniknm] [BEAM-13112]: playground embedded version
[noreply] [BEAM-13236] Properly close kinesis producer on teardown (#15955)
[noreply] Merge pull request #16150 from [BEAM-13396][Playground][Bugfix] Issues
[noreply] Merge pull request #16148 from [BEAM-13394][Playground] [Bugfix] Fix
[noreply] Avoid overriding explicit portable job submission disabling. (#16143)
[noreply] Merge pull request #16151 from [BEAM-13350][Playground] Support running
[zyichi] [BEAM-13373] Increase python post commit timeout to reduce chance of

------------------------------------------
[...truncated 337.86 KB...]
> Task :sdks:java:extensions:sql:perf-tests:testClasses UP-TO-DATE
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) started.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 1c1ba0c23229ca86f98761e325db1a59
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.36.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.9.1/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.36.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]
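(Aside: the -DbeamTestPipelineOptions flags in the command line above are the usual Beam pipeline options. A minimal sketch, not the test's actual code, of how such flags are typically materialized with PipelineOptionsFactory; it assumes the Dataflow runner and GCP options modules are on the classpath:

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class OptionsSketch {
      public static void main(String[] args) {
        // Same flags as in the command line above, passed as ordinary args.
        PipelineOptions options =
            PipelineOptionsFactory.fromArgs(
                    "--project=apache-beam-testing",
                    "--tempLocation=gs://temp-storage-for-perf-tests/loadtests",
                    "--runner=DataflowRunner",
                    "--region=us-central1",
                    "--numWorkers=5",
                    "--maxNumWorkers=5",
                    "--autoscalingAlgorithm=NONE")
                .withValidation()
                .create();
        System.out.println(options.getRunner()); // -> DataflowRunner
      }
    }
)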
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Dec 07, 2021 6:44:51 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Dec 07, 2021 6:44:52 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Dec 07, 2021 6:44:52 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 251 files. Enable logging at DEBUG level to see which files will be staged.
    Dec 07, 2021 6:44:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Dec 07, 2021 6:44:54 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Dec 07, 2021 6:44:55 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL: SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score` FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS` WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Dec 07, 2021 6:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Dec 07, 2021 6:44:55 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Dec 07, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Dec 07, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1535841857]. Correct one of the following root causes:
      No Coder has been manually specified; you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:106)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:48)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:150)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Dec 07, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Dec 07, 2021 6:44:56 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Dec 07, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL: SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score` FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS` WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Dec 07, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Dec 07, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Dec 07, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Dec 07, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@2311926]. Correct one of the following root causes:
      No Coder has been manually specified; you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:106)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:48)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:163)
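(Aside: both readUsingDirectReadMethod and readUsingDefaultMethod fail with the same coder-inference error: the RowMonitor ParDo outputs Beam Rows, and a PCollection<Row> cannot get a coder without an explicit schema. A minimal sketch of the fix the message itself suggests, with field names assumed from the query above rather than taken from the test's real schema:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFix {
      // Give the Row PCollection an explicit schema so coder inference
      // never has to guess; setRowSchema(schema) is shorthand for
      // setCoder(RowCoder.of(schema)).
      static PCollection<Row> withExplicitSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addNullableField("author", Schema.FieldType.STRING) // assumed fields
                .addNullableField("type", Schema.FieldType.STRING)
                .addNullableField("title", Schema.FieldType.STRING)
                .addNullableField("score", Schema.FieldType.INT64)
                .build();
        return rows.setRowSchema(schema);
      }
    }
)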
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Dec 07, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Dec 07, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Dec 07, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL: SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score` FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS` WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Dec 07, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Dec 07, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Dec 07, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Dec 07, 2021 6:44:57 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Dec 07, 2021 6:44:58 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
    Dec 07, 2021 6:44:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Dec 07, 2021 6:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 252 files from PipelineOptions.filesToStage to staging location to prepare for execution.
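(Aside: the BEAMPlan above shows the projection (by, type, title, score) and the entire filter being pushed into the source. At the IO level this corresponds to a BigQuery Storage Read API read with selected fields and a row restriction; a rough hand-written equivalent, with the table reference assumed for illustration rather than taken from the test's configuration:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    public class PushDownEquivalent {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p.apply(
            BigQueryIO.readTableRows()
                .from("apache-beam-testing:beam.HACKER_NEWS") // assumed table reference
                .withMethod(Method.DIRECT_READ)
                // Projection and filter matching what the planner pushed down:
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }
)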
    Dec 07, 2021 6:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.36.0-SNAPSHOT-C8O8KfNYPV3WbkH4wo9YAziq_fRC_ckq-Y_KfwDM9lg.jar
    Dec 07, 2021 6:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test7416703788180989516.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-insGK2GDuQ3UV83jd0bReIZSl8s2ybYxH-gG2GDmWVs.jar
    Dec 07, 2021 6:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 251 files cached, 1 files newly uploaded in 0 seconds
    Dec 07, 2021 6:45:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Dec 07, 2021 6:45:01 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <106246 bytes, hash b89daf15e97c8a1658d9a4940684f1bb8867758fed4fe04493dadee8fb6fa4e3> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-uJ2vFel8ihZY2aSUBoTxu4hndY_tT-BEk9re6PtvpOM.pb
    Dec 07, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Dec 07, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Dec 07, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Dec 07, 2021 6:45:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.36.0-SNAPSHOT
    Dec 07, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-12-07_10_45_04-16102847005114425407?project=apache-beam-testing
    Dec 07, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-12-07_10_45_04-16102847005114425407
    Dec 07, 2021 6:45:05 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-12-07_10_45_04-16102847005114425407
    Dec 07, 2021 6:45:13 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-12-07T18:45:07.814Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Dec 07, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:45:21.414Z: Worker configuration: e2-standard-2 in us-central1-c.
    Dec 07, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:45:22.248Z: Expanding CoGroupByKey operations into optimizable parts.
    Dec 07, 2021 6:45:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:45:22.520Z: Expanding GroupByKey operations into optimizable parts.
    Dec 07, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:45:22.681Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    Dec 07, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:45:23.106Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    Dec 07, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:45:23.234Z: Fusing consumer BeamPushDownIOSourceRel_272/ParDo(RowMonitor) into BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
    Dec 07, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:45:23.277Z: Fusing consumer ParDo(TimeMonitor) into BeamPushDownIOSourceRel_272/ParDo(RowMonitor)
    Dec 07, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:45:23.687Z: Executing operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Dec 07, 2021 6:45:25 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:45:23.765Z: Starting 5 workers in us-central1-c...
    Dec 07, 2021 6:45:39 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:45:38.694Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Dec 07, 2021 6:46:09 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:46:07.513Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Dec 07, 2021 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:46:33.466Z: Workers have started successfully.
    Dec 07, 2021 6:46:33 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:46:33.527Z: Workers have started successfully.
    Dec 07, 2021 6:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:47:04.259Z: Finished operation BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_272/ParDo(RowMonitor)+ParDo(TimeMonitor)
    Dec 07, 2021 6:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:47:04.928Z: Cleaning up.
    Dec 07, 2021 6:47:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:47:05.039Z: Stopping worker pool...
    Dec 07, 2021 6:49:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:49:37.480Z: Autoscaling: Resized worker pool from 5 to 0.
    Dec 07, 2021 6:49:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-12-07T18:49:37.520Z: Worker pool stopped.
    Dec 07, 2021 6:49:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-12-07_10_45_04-16102847005114425407 finished with status DONE.

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): ab86fe47-8019-4f08-82c4-d7c6593da8a5 and timestamp: 2021-12-07T18:49:46.619000000Z:
                     Metric:                    Value:
                   read_time                     7.349
                 fields_read                 4375276.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Dec 07, 2021 6:49:46 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 4 mins 58.865 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 28s
152 actionable tasks: 96 executed, 56 from cache

Publishing build scan...
https://gradle.com/s/35t66owvie3ki

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
