See <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/1443/display/redirect?page=changes>
Changes:
[noreply] [BEAM-11457] Add option to skip key-value clone (#13543)
------------------------------------------
[...truncated 385.25 KB...]
Jan 04, 2021 6:44:57 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jan 04, 2021 6:44:57 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 228 files. Enable logging at DEBUG level to see which files will be staged.
Jan 04, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery method is set to: DIRECT_READ
Jan 04, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
Jan 04, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
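For readers following along, a query of this shape can be issued through Beam SQL's SqlTransform. The sketch below is illustrative only, not the integration test's actual wiring; it assumes a table named HACKER_NEWS has already been registered with the SQL environment via a table provider, and that `pipeline` is an existing Pipeline:

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Illustrative sketch: run the logged query against a pre-registered
    // HACKER_NEWS table. Table registration and pipeline setup are omitted.
    PCollection<Row> filtered =
        pipeline.apply(
            SqlTransform.query(
                "SELECT `by` AS author, type, title, score "
                    + "FROM HACKER_NEWS "
                    + "WHERE (type = 'story' OR type = 'job') AND score > 2"));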
Jan 04, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery method is set to: DIRECT_READ
Jan 04, 2021 6:44:59 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
Jan 04, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
  LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
    BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
Jan 04, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
  BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@141433091]. Correct one of the following root causes:
      No Coder has been manually specified; you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:162)
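The second root cause in the exception is the actionable one here: coder inference always fails for a raw PCollection<Row>, so the schema must be attached explicitly before the collection is consumed. A minimal sketch of that fix follows; the method and variable names are illustrative and this is not the code at BeamSqlRelUtils.java:72:

    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    // Attaching the schema gives the PCollection a SchemaCoder, which is
    // exactly what the CoderRegistry cannot infer for Row on its own.
    static PCollection<Row> attachSchema(PCollection<Row> rows, Schema schema) {
      return rows.setRowSchema(schema); // equivalent to setCoder(RowCoder.of(schema))
    }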
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
Jan 04, 2021 6:45:00 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery method is set to: DEFAULT
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery method is set to: DEFAULT
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
  LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
    BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
  BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection@1977938014]. Correct one of the following root causes:
      No Coder has been manually specified; you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:280)
        at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:117)
        at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:153)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:546)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:498)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:72)
        at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:42)
        at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:176)
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery method is set to: DIRECT_READ
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, `HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND `HACKER_NEWS`.`score` > 2
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery method is set to: DIRECT_READ
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
  LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
    BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
  BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])
Jan 04, 2021 6:45:01 PM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
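Unlike the earlier plans, this one replaces the filtering BeamCalcRel over a full-width source with a BeamPushDownIOSourceRel that reads only four fields and hands the predicate to BigQuery. In plain BigQueryIO terms, the effect is roughly the sketch below; the table reference is a placeholder and this is not the integration test's own configuration:

    import java.util.Arrays;
    import com.google.api.services.bigquery.model.TableRow;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead;
    import org.apache.beam.sdk.values.PCollection;

    // DIRECT_READ uses the BigQuery Storage Read API, which accepts a field
    // projection and a row restriction, so neither is evaluated in a ParDo.
    PCollection<TableRow> rows =
        pipeline.apply(
            BigQueryIO.readTableRows()
                .from("some-project:some_dataset.HACKER_NEWS") // placeholder
                .withMethod(TypedRead.Method.DIRECT_READ)
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                .withRowRestriction(
                    "(`type` = 'story' OR `type` = 'job') AND `score` > 2"));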
Jan 04, 2021 6:45:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jan 04, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 229 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Jan 04, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-worker.jar as beam-runners-google-cloud-dataflow-java-legacy-worker-2.28.0-SNAPSHOT-yJs-ekoQlgpdzGvvlXPsQCJWEfiGBRzmuux0kGrd2PA.jar
Jan 04, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /tmp/test1515115725002049070.zip to gs://temp-storage-for-perf-tests/loadtests/staging/test-WOgaOUli1ZAapM7Y50ZST9mmGZzSU73ujKmqj3jGVBE.jar
Jan 04, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.28.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.28.0-SNAPSHOT-0Rvubj6CutKP_eMLlF-Pc5nUJxN4wLLJkMoGEem0Wxw.jar
Jan 04, 2021 6:45:05 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.28.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.28.0-SNAPSHOT-unshaded-IMr7l0eIGuZ83KXwxhxkC1_kY1Hot0AiHh7EHlbZmV0.jar
Jan 04, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.duct-tape/duct-tape/1.0.8/92edc22a9ab2f3e17c9bf700aaee377d50e8b530/duct-tape-1.0.8.jar to gs://temp-storage-for-perf-tests/loadtests/staging/duct-tape-1.0.8-Mc7xLd7JedH4bXz3CMQaF9pSPQXGhf1mQunQsq3bckA.jar
Jan 04, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/org.rnorth.visible-assertions/visible-assertions/2.1.2/20d31a578030ec8e941888537267d3123c2ad1c1/visible-assertions-2.1.2.jar to gs://temp-storage-for-perf-tests/loadtests/staging/visible-assertions-2.1.2-RQSulosjfNzcto_1sHqmOr5JkvkHp3w9YSCqm5BBQBw.jar
Jan 04, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna-platform/4.0.0/deb6bf66918989b50209b8c9aaf3b2561af7f011/jna-platform-4.0.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-platform-4.0.0-B21i7Yfna9yzdQ_-gKpJXuIRK9chJmOVTWVH9IJEkuk.jar
Jan 04, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading /home/jenkins/.gradle/caches/modules-2/files-2.1/net.java.dev.jna/jna/5.5.0/e0845217c4907822403912ad6828d8e0b256208/jna-5.5.0.jar to gs://temp-storage-for-perf-tests/loadtests/staging/jna-5.5.0-swj66_5O1AnehBDgpjLRZLISawNfbqz_lo05CMr7TZ4.jar
Jan 04, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 222 files cached, 7 files newly uploaded in 1 seconds
Jan 04, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
Jan 04, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
Jan 04, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
Jan 04, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s4
Jan 04, 2021 6:45:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
Jan 04, 2021 6:45:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <100177 bytes, hash b094a52da632d453e7c6ae4ff48b084e02358f13e8f79ed2734ef498ad9c5cba> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-sJSlLaYy1FPnxq5P9IsITgI1jxPo957Sc070mK2cXLo.pb
Jan 04, 2021 6:45:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.28.0-SNAPSHOT
Jan 04, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-01-04_10_45_07-4615965615321343589?project=apache-beam-testing
Jan 04, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-01-04_10_45_07-4615965615321343589
Jan 04, 2021 6:45:08 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-01-04_10_45_07-4615965615321343589
Jan 04, 2021 6:45:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-01-04T18:45:07.167Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
Jan 04, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:13.849Z: Worker configuration: n1-standard-1 in us-central1-f.
Jan 04, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:14.500Z: Expanding CoGroupByKey operations into optimizable parts.
Jan 04, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:14.539Z: Expanding GroupByKey operations into optimizable parts.
Jan 04, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:14.566Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jan 04, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:14.641Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jan 04, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:14.668Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
Jan 04, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:14.697Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
Jan 04, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:14.730Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
Jan 04, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:15.171Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
Jan 04, 2021 6:45:15 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:15.240Z: Starting 5 workers in us-central1-f...
Jan 04, 2021 6:45:30 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:29.460Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jan 04, 2021 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:37.400Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
Jan 04, 2021 6:45:37 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:37.422Z: Resized worker pool to 4, though goal was 5. This could be a quota issue.
Jan 04, 2021 6:45:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:45:47.631Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
Jan 04, 2021 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:46:06.406Z: Workers have started successfully.
Jan 04, 2021 6:46:07 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:46:06.431Z: Workers have started successfully.
Jan 04, 2021 6:46:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:46:36.568Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
Jan 04, 2021 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:46:36.688Z: Cleaning up.
Jan 04, 2021 6:46:38 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:46:36.764Z: Stopping worker pool...
Jan 04, 2021 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:47:21.977Z: Autoscaling: Resized worker pool from 5 to 0.
Jan 04, 2021 6:47:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-01-04T18:47:22.018Z: Worker pool stopped.
Jan 04, 2021 6:47:27 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-01-04_10_45_07-4615965615321343589 finished with status DONE.
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 1e1550b1-af73-43a2-819b-69a2471d2a0a and timestamp: 2021-01-04T18:47:27.897000000Z:
        Metric:        Value:
        read_time      11.646
        fields_read    4375276.0
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
Jan 04, 2021 6:47:28 PM org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
WARNING: Missing property -- measurement/database. Metrics won't be published.
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED
3 tests completed, 2 failed
Finished generating test XML results (0.022 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
Watching 2235 directories to track changes
Watching 2241 directories to track changes
Watching 2242 directories to track changes
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 10,5,main]) completed. Took 2 mins 35.102 secs.
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 3m 11s
145 actionable tasks: 92 executed, 53 from cache
Watching 2242 directories to track changes
Publishing build scan...
https://gradle.com/s/5oj4ezrfffrmg
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]