See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/2406/display/redirect?page=changes>
Changes:

[ruwan.lambrichts] Clarify additional_bq_parameters argument

[noreply] Fix broken 'differences from pandas' link

[noreply] Added GroupBy row in Aggregation table.

[Etienne Chauchot] [BEAM-5172] Temporary ignore testSplit and testSizes tests waiting for a

[samuelw] [BEAM-12740] Remove matching to filter files when renaming gcs files in

[noreply] [BEAM-3304] Helper functions for triggers (#15430)

[esert] Bump a throttling counter on BigQueryRead retries due to

------------------------------------------
[...truncated 346.51 KB...]
Caching disabled for task ':sdks:java:extensions:sql:testJar' because:
  Caching has not been enabled for the task
Task ':sdks:java:extensions:sql:testJar' is not up-to-date because:
  No history is available.
:sdks:java:extensions:sql:testJar (Thread[Daemon worker,5,main]) completed. Took 0.133 secs.
:sdks:java:extensions:sql:perf-tests:compileTestJava (Thread[Daemon worker,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:compileTestJava
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:compileTestJava'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:compileTestJava' is 8b54be572e3a891ec50ae23f0cd874b6
Task ':sdks:java:extensions:sql:perf-tests:compileTestJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':sdks:java:extensions:sql:perf-tests:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with JDK Java compiler API.
Created classpath snapshot for incremental compilation in 0.125 secs. 3324 duplicate classes found in classpath (see all with --debug).
Stored cache entry for task ':sdks:java:extensions:sql:perf-tests:compileTestJava' with cache key 8b54be572e3a891ec50ae23f0cd874b6
:sdks:java:extensions:sql:perf-tests:compileTestJava (Thread[Daemon worker,5,main]) completed. Took 1.934 secs.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Daemon worker,5,main]) started.

> Task :sdks:java:extensions:sql:perf-tests:testClasses
Skipping task ':sdks:java:extensions:sql:perf-tests:testClasses' as it has no actions.
:sdks:java:extensions:sql:perf-tests:testClasses (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) started.
Gradle Test Executor 4 started executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest
Custom actions are attached to task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
Build cache key for task ':sdks:java:extensions:sql:perf-tests:integrationTest' is 52158865a6a1134246d7f0cd77fcdbe5
Task ':sdks:java:extensions:sql:perf-tests:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 4'.
Working directory: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests>
Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--project=apache-beam-testing","--tempLocation=gs://temp-storage-for-perf-tests/loadtests","--tempRoot=gs://temp-storage-for-perf-tests/loadtests","--metricsBigQueryDataset=beam_performance","--metricsBigQueryTable=sql_bqio_read_java_batch","--runner=DataflowRunner","--maxNumWorkers=5","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/6.8.3/workerMain/gradle-worker.jar worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 4'
Successfully started process 'Gradle Test Executor 4'

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod STANDARD_ERROR
    Sep 10, 2021 12:46:32 AM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    Sep 10, 2021 12:46:33 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 10, 2021 12:46:34 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 249 files. Enable logging at DEBUG level to see which files will be staged.
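The -DbeamTestPipelineOptions JSON array in the command above is how these integration tests receive their pipeline configuration. A minimal sketch of consuming it with the standard TestPipeline helper (the class name below is illustrative, not taken from the test's source):

    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.testing.TestPipeline;

    public class OptionsSketch {
      public static void main(String[] args) {
        // TestPipeline parses the JSON array from the beamTestPipelineOptions
        // system property into a PipelineOptions instance.
        PipelineOptions options = TestPipeline.testingPipelineOptions();
        // For this job that yields DataflowRunner, project apache-beam-testing, etc.
        System.out.println(options.getRunner().getSimpleName());
      }
    }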
    Sep 10, 2021 12:46:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 12:46:37 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:46:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 12:46:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 12:46:38 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:46:38 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_4/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1386607672]. Correct one of the following root causes:
      No Coder has been manually specified; you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 12:46:39 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_118/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@725804086]. Correct one of the following root causes:
      No Coder has been manually specified; you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:286)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:154)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:547)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:499)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:81)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
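Both failures above share one root cause: a PCollection of Beam Rows reached pipeline construction without a schema attached, so no coder could be inferred. A minimal sketch of the remedy the exception message itself names, PCollection.setRowSchema (the field names and types here are illustrative, taken from the query's projected columns rather than from the test's source):

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    class RowSchemaFix {
      // Attaching a schema lets Beam infer a RowCoder for the Row elements,
      // avoiding the PTransform.getOutputCoder failure seen above.
      static PCollection<Row> withRowSchema(PCollection<Row> rows) {
        Schema schema =
            Schema.builder()
                .addStringField("author")
                .addStringField("type")
                .addStringField("title")
                .addInt32Field("score") // type assumed for illustration
                .build();
        return rows.setRowSchema(schema);
      }
    }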
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])

    Sep 10, 2021 12:46:40 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
    INFO: Pushing down the following filter: (type = 'story' OR type = 'job') AND score > 2
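Unlike the first two runs, this plan replaces the BeamCalcRel with a BeamPushDownIOSourceRel: the projection and filter are pushed into the BigQuery read itself. At the IO level, the pushed-down read amounts to roughly the following sketch; the table name is an assumption for illustration, and the test wires this up through Beam SQL rather than calling BigQueryIO directly:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;

    class PushDownAsIO {
      static TypedRead<TableRow> read() {
        return BigQueryIO.readTableRows()
            .from("bigquery-public-data:hacker_news.full") // table name assumed
            .withMethod(TypedRead.Method.DIRECT_READ)
            // The planner's push-down becomes a column projection...
            .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
            // ...plus a row restriction evaluated by the BigQuery Storage API.
            .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2");
      }
    }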
    Sep 10, 2021 12:46:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Sep 10, 2021 12:46:44 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 250 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Sep 10, 2021 12:46:44 AM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
    INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.34.0-SNAPSHOT-ucAidJrqxK96xuFPkyoKLCyoB4CvF-dHe3bPkncTQpU.jar
    Sep 10, 2021 12:46:44 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading /tmp/test3301184806270174235.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-6EXH181S_JuJL-rLXn0F5Zi7ecJnf0DFyDuy8-YYA14.jar
    Sep 10, 2021 12:46:45 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 249 files cached, 1 files newly uploaded in 0 seconds
    Sep 10, 2021 12:46:45 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
    Sep 10, 2021 12:46:45 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <104003 bytes, hash 286996d6b9ece45dc905ba2aa66565d4ecb08fdf6684b03495788253c30363d0> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-KGmW1rns5F3JBboqpmVl1Oywj99mhLA0lXiCU8MDY9A.pb
    Sep 10, 2021 12:46:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
    Sep 10, 2021 12:46:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_272/ParDo(RowMonitor) as step s2
    Sep 10, 2021 12:46:47 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Sep 10, 2021 12:46:48 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
    Sep 10, 2021 12:46:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-09_17_46_48-13252814087671969139?project=apache-beam-testing
    Sep 10, 2021 12:46:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-09-09_17_46_48-13252814087671969139
    Sep 10, 2021 12:46:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-09_17_46_48-13252814087671969139
    Sep 10, 2021 12:46:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-09-10T00:46:52.073Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Sep 10, 2021 12:47:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T00:46:59.051Z: Worker configuration: e2-standard-2 in us-central1-a.
    Sep 10, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2021-09-10T00:47:00.326Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/19495 instances, 2/0 CPUs, 25/247716 disk GB, 0/2397 SSD disk GB, 1/272 instance groups, 1/275 managed instance groups, 1/501 instance templates, 1/724 in-use IP addresses. Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
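The quota failure (2 CPUs required, 0 available) surfaces to the test through the job's terminal state. A minimal sketch of the submit-and-wait pattern the log reflects (names illustrative; this is not the test's actual code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    class RunAndWait {
      static PipelineResult.State runBlocking(Pipeline pipeline) {
        PipelineResult result = pipeline.run();
        // Blocks until the Dataflow job reaches a terminal state; here that
        // state is FAILED because the worker pool could not be provisioned.
        return result.waitUntilFinish();
      }
    }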
    Sep 10, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T00:47:00.409Z: Cleaning up.
    Sep 10, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T00:47:00.504Z: Worker pool stopped.
    Sep 10, 2021 12:47:01 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-09-10T00:47:01.762Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 10, 2021 12:47:06 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-09-09_17_46_48-13252814087671969139 failed with status FAILED.
    Sep 10, 2021 12:47:06 AM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT
    Sep 10, 2021 12:47:06 AM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): db9d6a6b-3f3e-419b-a5af-88970102aeda and timestamp: 2021-09-10T00:47:06.309000000Z:
                     Metric:                    Value:
                   read_time                       0.0
                 fields_read                      -1.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Sep 10, 2021 12:47:06 AM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be published.

Gradle Test Executor 4 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.029 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 38.058 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
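The -1.0 reported for fields_read above reflects a counter lookup against a job that failed before reading anything. A rough sketch of such a lookup using Beam's public metrics API; the method name and the -1 sentinel (mirroring the value the test reported) are illustrative assumptions:

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    class CounterLookup {
      static long counterOrMinusOne(PipelineResult result, String namespace, String name) {
        MetricQueryResults metrics =
            result.metrics().queryMetrics(
                MetricsFilter.builder()
                    .addNameFilter(MetricNameFilter.named(namespace, name))
                    .build());
        for (MetricResult<Long> counter : metrics.getCounters()) {
          // Attempted values are available even when committed values are not.
          return counter.getAttempted();
        }
        // No matching counter, e.g. the workflow failed before reading any fields.
        return -1L;
      }
    }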
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 44s
152 actionable tasks: 104 executed, 48 from cache

Publishing build scan...
https://gradle.com/s/twy4oukgudnbu

Stopped 3 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
