See <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/334/display/redirect?page=changes>
Changes:
[robertwb] [BEAM-9340] Populate requirement for timer families.
[pabloem] Revert "Merge pull request #11104 from y1chi/update_tornado_test"
[daniel.o.programmer] [BEAM-9642] Create runtime invokers for SDF methods.
------------------------------------------
[...truncated 279.28 KB...]
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
  No Coder has been manually specified; you may do so using .setCoder().
  Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
  Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
    at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
    at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
    at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
    at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
    at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
    at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
    at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
    at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
    at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:142)
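The exception above names its own remedy: attach a schema to the Row PCollection so a RowCoder can be inferred. A minimal sketch of that pattern, using an illustrative two-column schema rather than the IT's actual table schema:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.schemas.Schema;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.Row;

    public class RowSchemaFix {
      // Illustrative schema; the IT derives its schema from the BigQuery table.
      private static final Schema SCHEMA =
          Schema.builder().addStringField("type").addInt32Field("score").build();

      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create();
        PCollection<Row> rows =
            pipeline
                .apply(Create.of("story,3", "job,5"))
                .apply(ParDo.of(new DoFn<String, Row>() {
                  @ProcessElement
                  public void process(@Element String line, OutputReceiver<Row> out) {
                    String[] parts = line.split(",");
                    out.output(Row.withSchema(SCHEMA)
                        .addValues(parts[0], Integer.parseInt(parts[1]))
                        .build());
                  }
                }))
                // Without this call, coder inference fails exactly as in the log:
                // "Cannot provide a coder for a Beam Row."
                .setRowSchema(SCHEMA);
        pipeline.run().waitUntilFinish();
      }
    }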
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
Apr 02, 2020 1:11:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery method is set to: DEFAULT
Apr 02, 2020 1:11:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`,
`HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND
`HACKER_NEWS`.`score` > 2
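For orientation, a query of this shape is ordinarily submitted through Beam SQL's public entry point, SqlTransform. A sketch, assuming hackerNewsRows is a schema-aware PCollection<Row> and the usual Beam SQL imports; the IT itself wires the table through a BigQuery table provider instead:

    // With a single input, Beam SQL exposes it under the table name PCOLLECTION.
    PCollection<Row> filtered =
        hackerNewsRows.apply(
            SqlTransform.query(
                "SELECT `by` AS `author`, `type`, `title`, `score` "
                    + "FROM PCOLLECTION "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));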
Apr 02, 2020 1:11:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery method is set to: DEFAULT
Apr 02, 2020 1:11:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
  LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
    BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
Apr 02, 2020 1:11:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], $condition=[$t21])
  BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
java.lang.IllegalStateException: Unable to return a default Coder for BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output [PCollection]. Correct one of the following root causes:
  No Coder has been manually specified; you may do so using .setCoder().
  Inferring a Coder from the CoderRegistry failed: Cannot provide a coder for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
  Using the default output Coder from the producing PTransform failed: PTransform.getOutputCoder called.
    at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
    at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:278)
    at org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
    at org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
    at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
    at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
    at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
    at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
    at org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:156)
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
Apr 02, 2020 1:11:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery method is set to: DIRECT_READ
Apr 02, 2020 1:11:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`,
`HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND
`HACKER_NEWS`.`score` > 2
Apr 02, 2020 1:11:00 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
INFO: BigQuery method is set to: DIRECT_READ
Apr 02, 2020 1:11:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
  LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
    BeamIOSourceRel(table=[[beam, HACKER_NEWS]])
Apr 02, 2020 1:11:00 AM org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
  BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 'job'))>($5, 2)}, unsupported{}]])
Apr 02, 2020 1:11:01 AM org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable buildIOReader
INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 'job') AND `score` > 2
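The two preceding records show both halves of the push-down: usedFields trims the read to four columns, and the supported filter is handed to the storage layer. Outside of Beam SQL, the same effect can be requested directly on BigQueryIO's Storage Read path; a sketch with an illustrative table reference, not the IT's actual setup:

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.TypedRead.Method;

    public class PushDownSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p.apply(
            BigQueryIO.readTableRows()
                .from("my-project:beam.HACKER_NEWS") // illustrative table reference
                .withMethod(Method.DIRECT_READ)
                // Project push-down: only these columns are read from storage.
                .withSelectedFields(Arrays.asList("by", "type", "title", "score"))
                // Predicate push-down: evaluated by the BigQuery Storage Read API.
                .withRowRestriction("(type = 'story' OR type = 'job') AND score > 2"));
        p.run().waitUntilFinish();
      }
    }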
Apr 02, 2020 1:11:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Apr 02, 2020 1:11:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 206 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Apr 02, 2020 1:11:01 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
WARNING: Skipping non-existent file to stage ${dataflowWorkerJar}.
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-4Q9Oo0jmStzuBHdamOqrJw.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-tests-E2IYwlLzl-3iKpkBBRPMQA.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.21.0-SNAPSHOT-ccTG61QrWBuDXlF1rxhYmA.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.21.0-SNAPSHOT-B55kT8nrF625ITAB8L8Fvg.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-tests-tVUGlV2TQ9bjeJd7dbaHQQ.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.21.0-SNAPSHOT-swgozZqepoISLJE3erXobA.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.21.0-SNAPSHOT-DJxNeN--f8JQxBHRoZylsg.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.21.0-SNAPSHOT-EHe9fNFEH9CFFvsroYrmRg.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.21.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.21.0-SNAPSHOT-tests-mWD-harklyKLAm8PQjn8XA.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-tests-1rXiiZVAvYByfADkjgaTkA.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.21.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.21.0-SNAPSHOT-tests-00EijxsT7iyyqAeZRR5lJQ.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.21.0-SNAPSHOT-tests-or3ffd9koFvFWzCzqtTgcA.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/classes/java/test> to gs://temp-storage-for-perf-tests/loadtests/staging/test-55pB7-5LQq8jwgISkwmhCA.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.21.0-SNAPSHOT-uY4rp3qhdNFkfMO3vE5c_w.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.21.0-SNAPSHOT-9bjTdcVcSDnWqCrsVoQJhw.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.21.0-SNAPSHOT-ZfPnPvcAvNMJaSGocp9KXA.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.21.0-SNAPSHOT-KHlhrj2ZJKiW54uW0ia4Vw.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.21.0-SNAPSHOT-2RVPVcsXwyS1bbUT8hV72A.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.21.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.21.0-SNAPSHOT-tests-p4yXMXwH1MwK09_aoQ8tUQ.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.21.0-SNAPSHOT-IYIBCKGWc6qdc-fIzxiXug.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.21.0-SNAPSHOT-unshaded.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.21.0-SNAPSHOT-unshaded-rU2ZZDWdUL3FxZomx0Vbig.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.21.0-SNAPSHOT-4o8dy9M7ST9ledu1Wk_HkQ.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.21.0-SNAPSHOT-_pl1-R50JB9dAnAcRAFE0w.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.21.0-SNAPSHOT-aSsX1w0ovkvhcRHYLiTS_Q.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.21.0-SNAPSHOT-0wMcVLz3OrI78PalES5cnw.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.21.0-SNAPSHOT-tests.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.21.0-SNAPSHOT-tests-l1AbkzNR4KQPOnM38AcIkg.jar
Apr 02, 2020 1:11:02 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.21.0-SNAPSHOT.jar> to gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.21.0-SNAPSHOT-XlN1t_Fk01njBieCkumNIw.jar
Apr 02, 2020 1:11:03 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 178 files cached, 27 files newly uploaded in 2 seconds
Apr 02, 2020 1:11:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource) as step s1
Apr 02, 2020 1:11:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
Apr 02, 2020 1:11:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
Apr 02, 2020 1:11:04 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s4
Apr 02, 2020 1:11:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
Apr 02, 2020 1:11:04 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <12081 bytes, hash tnUK35e60HDRgDfbQbnaSg> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-tnUK35e60HDRgDfbQbnaSg.pb
Apr 02, 2020 1:11:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.21.0-SNAPSHOT
Apr 02, 2020 1:11:05 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$DefaultGcpRegionFactory create
WARNING: Region will default to us-central1. Future releases of Beam will require the user to set the region explicitly. https://cloud.google.com/compute/docs/regions-zones/regions-zones
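Per the warning, pipelines should pin the region themselves. A minimal sketch of setting it programmatically, equivalent to passing --region=us-central1 on the command line:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class RegionExample {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // Set explicitly so future Beam releases do not change behavior underneath.
        options.setRegion("us-central1");
      }
    }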
Apr 02, 2020 1:11:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-01_18_11_05-10148155721789860358?project=apache-beam-testing
Apr 02, 2020 1:11:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2020-04-01_18_11_05-10148155721789860358
Apr 02, 2020 1:11:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-04-01_18_11_05-10148155721789860358
Apr 02, 2020 1:11:07 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-04-02T01:11:05.318Z: The requested max number of workers (5) is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
Apr 02, 2020 1:11:11 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:09.903Z: Checking permissions granted to controller Service Account.
Apr 02, 2020 1:11:13 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:13.411Z: Worker configuration: n1-standard-1 in us-central1-f.
Apr 02, 2020 1:11:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:14.212Z: Expanding CoGroupByKey operations into optimizable parts.
Apr 02, 2020 1:11:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:14.306Z: Expanding GroupByKey operations into optimizable parts.
Apr 02, 2020 1:11:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:14.342Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Apr 02, 2020 1:11:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:14.425Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Apr 02, 2020 1:11:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:14.453Z: Fusing consumer BeamPushDownIOSourceRel_229/ParDo(RowMonitor) into BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)
Apr 02, 2020 1:11:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:14.482Z: Fusing consumer BeamCalcRel_285/ParDo(Calc) into BeamPushDownIOSourceRel_229/ParDo(RowMonitor)
Apr 02, 2020 1:11:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:14.524Z: Fusing consumer ParDo(TimeMonitor) into BeamCalcRel_285/ParDo(Calc)
Apr 02, 2020 1:11:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:15.079Z: Executing operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
Apr 02, 2020 1:11:16 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:15.148Z: Starting 5 workers in us-central1-f...
Apr 02, 2020 1:11:30 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-04-02T01:11:29.346Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
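The two linked API methods can also be driven from Java through the Cloud Monitoring client library; a hedged sketch, assuming the google-cloud-monitoring dependency and an illustrative descriptor name (not one taken from this project):

    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.MetricDescriptorName;

    public class DeleteOldMetricDescriptor {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          // Illustrative name; list descriptors first to find unused ones.
          MetricDescriptorName name = MetricDescriptorName.of(
              "apache-beam-testing", "custom.googleapis.com/dataflow/example_metric");
          client.deleteMetricDescriptor(name);
        }
      }
    }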
Apr 02, 2020 1:11:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:38.442Z: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
Apr 02, 2020 1:11:39 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:38.476Z: Resized worker pool to 4, though goal was 5. This could be a quota issue.
Apr 02, 2020 1:11:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:43.996Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
Apr 02, 2020 1:11:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:57.986Z: Workers have started successfully.
Apr 02, 2020 1:11:59 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:11:58.020Z: Workers have started successfully.
Apr 02, 2020 1:12:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:12:44.420Z: Finished operation BeamPushDownIOSourceRel_229/Read Input BQ Rows with push-down/Read(BigQueryStorageTableSource)+BeamPushDownIOSourceRel_229/ParDo(RowMonitor)+BeamCalcRel_285/ParDo(Calc)+ParDo(TimeMonitor)
Apr 02, 2020 1:12:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:12:44.638Z: Cleaning up.
Apr 02, 2020 1:12:45 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:12:44.751Z: Stopping worker pool...
Apr 02, 2020 1:14:37 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:14:37.278Z: Autoscaling: Resized worker pool from 5 to 0.
Apr 02, 2020 1:14:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-04-02T01:14:37.344Z: Worker pool stopped.
Apr 02, 2020 1:14:42 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2020-04-01_18_11_05-10148155721789860358 finished with status DONE.
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
Load test results for test (ID): 3ba3bac2-65de-4215-bcdd-2c2ae501c131 and timestamp: 2020-04-02T01:14:42.924000000Z:
Metric:       Value:
fields_read   4375276.0
read_time     27.275
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED
3 tests completed, 2 failed
Finished generating test XML results (0.02 secs) into: <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 3 mins 50.276 secs.
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 4m 49s
101 actionable tasks: 68 executed, 33 from cache
Publishing build scan...
https://gradle.com/s/3uuyquop77ark
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure