See 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/893/display/redirect?page=changes>

Changes:

[douglas.damon] Add Filter lesson to Go SDK katas

[douglas.damon] Reformat code for readability

[yoshiki.obata] [BEAM-9980] Configure Python versions for direct test suite tasks via

[douglas.damon] Change license info format in task-info.yaml

[douglas.damon] Update stepik course

[kevinsijo] [BEAM-9920] Enabling artifact staging for xlang transforms

[kevinsijo] documentation(beam/sdks/go): added apache license and function docstring

[kevinsijo] fix(beam/sdks/go): correcting changed function name

[noreply] Merge pull request #12581 from [BEAM-10378] Add Azure Blob Storage


------------------------------------------
[...truncated 291.97 KB...]

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for 
BeamIOSourceRel_3/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output 
[PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder 
for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: 
PTransform.getOutputCoder called.
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at 
org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at 
org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at 
org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at 
org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at 
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDirectReadMethod(BigQueryIOPushDownIT.java:151)
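
The failure above (and the identical one for readUsingDefaultMethod below) has the same root cause: the RowMonitor ParDo emits Beam Rows, and a coder for Row cannot be inferred from the CoderRegistry, so the output PCollection needs an explicit schema or coder before the pipeline finishes specifying. The following is a minimal sketch of the remedy the error message itself points at, assuming a hypothetical pass-through DoFn and field names; none of it is taken from the failing test.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.RowCoder;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

public class RowSchemaExample {
  // Hypothetical stand-in for ParDo(RowMonitor): passes rows through unchanged.
  static class PassThroughRowFn extends DoFn<Row, Row> {
    @ProcessElement
    public void processElement(@Element Row row, OutputReceiver<Row> out) {
      out.output(row);
    }
  }

  public static void main(String[] args) {
    Schema schema =
        Schema.builder().addStringField("type").addInt64Field("score").build();

    Pipeline p = Pipeline.create();

    PCollection<Row> rows =
        p.apply(
            Create.of(Row.withSchema(schema).addValues("story", 3L).build())
                .withCoder(RowCoder.of(schema)));

    // Without setRowSchema (or an explicit setCoder), finishSpecifying()
    // raises the IllegalStateException shown in the log above.
    PCollection<Row> monitored =
        rows.apply("RowMonitor", ParDo.of(new PassThroughRowFn()))
            .setRowSchema(schema);

    p.run().waitUntilFinish();
  }
}
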

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod STANDARD_ERROR
    Aug 20, 2020 6:45:11 AM 
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 20, 2020 6:45:11 AM 
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 6:45:11 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, 
`HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND 
`HACKER_NEWS`.`score` > 2
    Aug 20, 2020 6:45:11 AM 
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DEFAULT
    Aug 20, 2020 6:45:11 AM 
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 6:45:11 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2020 6:45:12 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..13=[{inputs}], expr#14=['story':VARCHAR], 
expr#15=[=($t8, $t14)], expr#16=['job':VARCHAR], expr#17=[=($t8, $t16)], 
expr#18=[OR($t15, $t17)], expr#19=[2], expr#20=[>($t5, $t19)], 
expr#21=[AND($t18, $t20)], author=[$t4], type=[$t8], title=[$t0], score=[$t5], 
$condition=[$t21])
      BeamIOSourceRel(table=[[beam, HACKER_NEWS]])


org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDefaultMethod FAILED
    java.lang.IllegalStateException: Unable to return a default Coder for 
BeamIOSourceRel_95/ParDo(RowMonitor)/ParMultiDo(RowMonitor).output 
[PCollection]. Correct one of the following root causes:
      No Coder has been manually specified;  you may do so using .setCoder().
      Inferring a Coder from the CoderRegistry failed: Cannot provide a coder 
for a Beam Row. Please provide a schema instead using PCollection.setRowSchema.
      Using the default output Coder from the producing PTransform failed: 
PTransform.getOutputCoder called.
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkState(Preconditions.java:507)
        at org.apache.beam.sdk.values.PCollection.getCoder(PCollection.java:277)
        at 
org.apache.beam.sdk.values.PCollection.finishSpecifying(PCollection.java:114)
        at 
org.apache.beam.sdk.runners.TransformHierarchy.finishSpecifyingInput(TransformHierarchy.java:191)
        at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:541)
        at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:493)
        at 
org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:69)
        at 
org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:39)
        at 
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT.readUsingDefaultMethod(BigQueryIOPushDownIT.java:165)

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2020 6:45:12 AM 
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2020 6:45:12 AM 
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 6:45:12 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQL:
    SELECT `HACKER_NEWS`.`by` AS `author`, `HACKER_NEWS`.`type`, 
`HACKER_NEWS`.`title`, `HACKER_NEWS`.`score`
    FROM `beam`.`HACKER_NEWS` AS `HACKER_NEWS`
    WHERE (`HACKER_NEWS`.`type` = 'story' OR `HACKER_NEWS`.`type` = 'job') AND 
`HACKER_NEWS`.`score` > 2
    Aug 20, 2020 6:45:12 AM 
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery method is set to: DIRECT_READ
    Aug 20, 2020 6:45:12 AM 
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable <init>
    INFO: BigQuery writeDisposition is set to: WRITE_EMPTY
    Aug 20, 2020 6:45:12 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: SQLPlan>
    LogicalProject(author=[$4], type=[$8], title=[$0], score=[$5])
      LogicalFilter(condition=[AND(OR(=($8, 'story'), =($8, 'job')), >($5, 2))])
        BeamIOSourceRel(table=[[beam, HACKER_NEWS]])

    Aug 20, 2020 6:45:12 AM 
org.apache.beam.sdk.extensions.sql.impl.CalciteQueryPlanner convertToBeamRel
    INFO: BEAMPlan>
    BeamCalcRel(expr#0..3=[{inputs}], proj#0..3=[{exprs}])
      BeamPushDownIOSourceRel(table=[[beam, HACKER_NEWS]], usedFields=[[by, 
type, title, score]], BigQueryFilter=[[supported{OR(=($8, 'story'), =($8, 
'job'))>($5, 2)}, unsupported{}]])

    Aug 20, 2020 6:45:12 AM 
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryTable 
buildIOReader
    INFO: Pushing down the following filter: (`type` = 'story' OR `type` = 
'job') AND `score` > 2
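
For context, the push-down logged here is what distinguishes this test from the two failing ones: with the read method set to DIRECT_READ, the planner rewrites the LogicalProject/LogicalFilter into a BeamPushDownIOSourceRel, so only the used fields and the supported predicate are read from BigQuery. The sketch below only reproduces the shape of that query with SqlTransform over an in-memory PCollection of Rows; the field names mirror the log, while the sample data, class name, and use of Create are assumptions and involve no BigQuery read or push-down at all.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.RowCoder;
import org.apache.beam.sdk.extensions.sql.SqlTransform;
import org.apache.beam.sdk.schemas.Schema;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

public class PushDownQueryShapeExample {
  public static void main(String[] args) {
    Schema schema =
        Schema.builder()
            .addStringField("by")
            .addStringField("type")
            .addStringField("title")
            .addInt64Field("score")
            .build();

    Pipeline p = Pipeline.create();

    // Tiny in-memory stand-in for the beam.HACKER_NEWS table (assumed data).
    PCollection<Row> hackerNews =
        p.apply(
            Create.of(
                    Row.withSchema(schema).addValues("alice", "story", "hello", 5L).build(),
                    Row.withSchema(schema).addValues("bob", "comment", "re: hello", 1L).build())
                .withCoder(RowCoder.of(schema)));

    // Same SELECT / WHERE shape as the query logged above; a single input
    // PCollection is exposed to SqlTransform as the table PCOLLECTION.
    PCollection<Row> result =
        hackerNews.apply(
            SqlTransform.query(
                "SELECT `by` AS author, `type`, `title`, `score` "
                    + "FROM PCOLLECTION "
                    + "WHERE (`type` = 'story' OR `type` = 'job') AND `score` > 2"));

    p.run().waitUntilFinish();
  }
}
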
    Aug 20, 2020 6:45:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Aug 20, 2020 6:45:14 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 215 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Aug 20, 2020 6:45:15 AM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-worker.jar as 
beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-P2wToHwZKZ2-FoCTL-bgk-bRHFHM3VxT5R8_PuVQyKU.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-tests-SlAkqjF6AoTQZi9ioyAfUIDwOrROEVBSvK7r6B2zJGo.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-mPM03APpmINxZoM80Agx3hyeQmn2s9WHQQlx0LTjspo.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading /tmp/test5918634144078612724.zip to 
gs://temp-storage-for-perf-tests/loadtests/staging/test-XwprdssSb_l5-tQmDRtWGCrXhDSs4_Zv1CDTkZrpK_s.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-tests-nSAt7hFqXyS699unp0dVHFI2cZDMehWJdTnbpn0KF_A.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-C7UKSe1wEfGGxCCCJC_C4EYxOrcdS3InltUGzlVJwt4.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/java-fn-execution/build/libs/beam-runners-java-fn-execution-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-java-fn-execution-2.25.0-SNAPSHOT-jzwCq5i-tTcaYhHUww2cGwxyLyX8uj6rLRQB72leExw.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-grg356BKMmq5k5SGYtNncwpjisnw5nqz6yBVphu-rFU.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-tests-YOPaSMVujI2iC6R8gFjJ8nGfy8YMSOvxdSw4OASpgRU.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/protobuf/build/libs/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-I9NuYOlhh6zKVbzm5JzaQtsmZhEJxUfI3aLYyTBbEBM.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/common/build/libs/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-common-2.25.0-SNAPSHOT-tests-CEhmk1HYd_BFFk2uq4o2hqJ3Q5mGMDKMVyomrwICUvE.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/build/libs/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-sql-2.25.0-SNAPSHOT-STXvNjcMwUNX3G9SHF0xOvgO8Po4Q8cGA-0RJBEabFo.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/mongodb/build/libs/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-mongodb-2.25.0-SNAPSHOT-J5k1G_-rwKAvpd7MiTStS5gWuZlqWE500ipeT7dCd-Q.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-tests-QIQay9hSjon1Zzzt6EIfIBihRxmoZieottq3WaCpZtE.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-USur1pFTD4d9HlkmyHjC-IqHLFKL0uFw2z6Z2erqFD8.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/io/google-cloud-platform/build/libs/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-io-google-cloud-platform-2.25.0-SNAPSHOT-qA9eE76tomXFvNteG3KRwIKf9SSOR504_W3IxYE7uXg.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/core-construction-java/build/libs/beam-runners-core-construction-java-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-core-construction-java-2.25.0-SNAPSHOT-NT-yniuPRxyYpoWVnFKDy_hUPGDA8SriMyNos4XBR88.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/build/libs/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-2.25.0-SNAPSHOT-L_2ssk6fZgg6AxEx0GP6c4PkeueS_zb5BTZ86e18VJ0.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-P2wToHwZKZ2-FoCTL-bgk-bRHFHM3VxT5R8_PuVQyKU.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/join-library/build/libs/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-join-library-2.25.0-SNAPSHOT-_m-xSm2yYBNKSbXC-t9BhExIw7LTuupTUWcneSDlCXk.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/runners/direct-java/build/libs/beam-runners-direct-java-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-direct-java-2.25.0-SNAPSHOT-q6v85PLsGfNUKAJqiXnh2yATk1TbOR0ETvq-MnuyXVc.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-xYn2sJ1grt60XpMjYkQAQA1CAUOYSdG_q5zcwKqWlsM.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/expansion-service/build/libs/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-expansion-service-2.25.0-SNAPSHOT-dsKdvzpwTKjeXmNQXv0FEvWtj_oqnCgJ3nI38YXztQU.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/testing/test-utils/build/libs/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-test-utils-2.25.0-SNAPSHOT-tests-9jS-bMecGsPTiwkVFpecFpyyB0VfZT5tf_Z4X-Bv_u4.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/google-cloud-platform-core/build/libs/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.25.0-SNAPSHOT-tests-oVTGoIzIMhqiKZU2ySauF-lSCXqZ92mPpeofR3tlTV8.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/vendor/sdks-java-extensions-protobuf/build/libs/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-vendor-sdks-java-extensions-protobuf-2.25.0-SNAPSHOT-pJjbKkRaIreKqWKO03Atd3vGIgx-t8fQZyzYZvxUHEg.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-healthcare/v1beta1-rev20200713-1.30.10/e91837f3ef70393435477f24da1d8037079de665/google-api-services-healthcare-v1beta1-rev20200713-1.30.10.jar
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-healthcare-v1beta1-rev20200713-1.30.10-JURWP1dWIZvDBGhzx49rM9wlx_3gzI-7LaDigoy5pnc.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-bigquery/v2-rev20200719-1.30.10/ef24b5e53a2e3133e8f3a48a5c7fb97c9a20efc9/google-api-services-bigquery-v2-rev20200719-1.30.10.jar
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-bigquery-v2-rev20200719-1.30.10-UO0rHodDXrS1DVt-XgTxulehStfKZiU0mkIbK9uOQ34.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-unshaded-pRphU9STc1m-sUagrEKPhtRklUBiBaD894bJ0M2v0SE.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-clouddebugger/v2-rev20200501-1.30.10/b33ff2dddd08848ea3f267bbddd636554eab43d8/google-api-services-clouddebugger-v2-rev20200501-1.30.10.jar
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-clouddebugger-v2-rev20200501-1.30.10-bU7wrCFf0eFwDFdYM5XXmUMgLpTGll41LtaPtLf_5Yc.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.api-client/google-api-client-java6/1.30.10/bab94eb59b41c2a0087ce53588ff33d7af15c13b/google-api-client-java6-1.30.10.jar
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/google-api-client-java6-1.30.10-m2I4oHqNXjz0pJ-TP3zK5_cofaMLXRJwTHexGAhXg04.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-storage/v1-rev20200611-1.30.10/189adae8ea043984d3814717203b470ab0517373/google-api-services-storage-v1-rev20200611-1.30.10.jar
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-storage-v1-rev20200611-1.30.10-INrcnpWLWvkN-Mf-ig-ab-lYg9NBuztZDHchmNhVCVc.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-9QEEC7iYYk7zP7-hOsXgs2Kehcq7D0Lvwvzbq5iHbOY.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.apis/google-api-services-dataflow/v1b3-rev20200713-1.30.10/9c7cdb809d0594a4c3347c299c85d89e85617769/google-api-services-dataflow-v1b3-rev20200713-1.30.10.jar
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-dataflow-v1b3-rev20200713-1.30.10-7G9SCylELI7xiLyl0DMKUipZJEhBA78X2SidesnpR-E.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.api-client/google-api-client-jackson2/1.30.10/9e2d0aa4eaf242bd76668afab8da3a8a8f18cf11/google-api-client-jackson2-1.30.10.jar
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/google-api-client-jackson2-1.30.10-VpK9R1T0BTfZBKYo0BYUAFSlh5Bc_I_Q_fdy5_Z9rTc.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-TKLcpI4nF18BLOydmOl4Qb3C8tk7nmL6sAwgH7_E6xo.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-MAFLKokIB9dNmD_LOoNwYlMAkiW2JB7q3ffhK56jl9Q.jar
    Aug 20, 2020 6:45:15 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-RqBrqXgeyq3KOJCBVPdxvj5Qvf6RadZEIHG1uyJyqsA.jar
    Aug 20, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 178 files cached, 37 files newly uploaded in 
1 seconds
    Aug 20, 2020 6:45:16 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/Read Input BQ Rows with 
push-down/Read(BigQueryStorageTableSource) as step s1
    Aug 20, 2020 6:45:16 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamPushDownIOSourceRel_229/ParDo(RowMonitor) as step s2
    Aug 20, 2020 6:45:16 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding BeamCalcRel_285/ParDo(Calc) as step s3
    Aug 20, 2020 6:45:16 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s4
    Aug 20, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to 
gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 20, 2020 6:45:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <93834 bytes, hash 
064087aaad7040fe55e5c7e4dad4e4ac3e1c5313b042327def1a179d76d692a0> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-BkCHqq1wQP5V5cfk2tTkrD4cUxOwQjJ97xoXnXbWkqA.pb
    Aug 20, 2020 6:45:16 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 20, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_23_45_16-16938772797866909794?project=apache-beam-testing
    Aug 20, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-19_23_45_16-16938772797866909794
    Aug 20, 2020 6:45:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-08-19_23_45_16-16938772797866909794
    Aug 20, 2020 6:45:18 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-20T06:45:16.683Z: The requested max number of workers (5) 
is ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
    Aug 20, 2020 6:45:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2020-08-20T06:45:21.881Z: Staged package 
google-api-services-cloudresourcemanager-v1-rev20200720-1.30.10-PQqVlsOIu2M59XLbEl59CW6wNmY5OYy_R71khhZQ8-Y.jar
 at location 
'gs://temp-storage-for-perf-tests/loadtests/staging/google-api-services-cloudresourcemanager-v1-rev20200720-1.30.10-PQqVlsOIu2M59XLbEl59CW6wNmY5OYy_R71khhZQ8-Y.jar'
 is inaccessible.
    Aug 20, 2020 6:45:25 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2020-08-20T06:45:24.082Z: Workflow failed. Causes: One or more 
access checks for temp location or staged files failed. Please refer to other 
error messages for details. For more information on security and permissions, 
please see https://cloud.google.com/dataflow/security-and-permissions.
    Aug 20, 2020 6:45:25 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T06:45:24.113Z: Cleaning up.
    Aug 20, 2020 6:45:25 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-20T06:45:24.180Z: Worker pool stopped.
    Aug 20, 2020 6:45:28 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-20T06:45:28.006Z: Your project already contains 100 
Dataflow-created metric descriptors and Stackdriver will not create new 
Dataflow custom metrics for this job. Each unique user-defined metric name 
(independent of the DoFn in which it is defined) produces a new metric 
descriptor. To delete old / unused metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Aug 20, 2020 6:45:32 AM 
org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2020-08-19_23_45_16-16938772797866909794 failed with status 
FAILED.
    Aug 20, 2020 6:45:32 AM org.apache.beam.sdk.testutils.metrics.MetricsReader 
getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace 
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT
    Aug 20, 2020 6:45:32 AM org.apache.beam.sdk.testutils.metrics.MetricsReader 
getCounterMetric
    SEVERE: Failed to get metric fields_read, from namespace 
org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_OUT
    Load test results for test (ID): 92bd21cd-2799-499d-99bb-0057fecc79d9 and 
timestamp: 2020-08-20T06:45:32.410000000Z:
                     Metric:                    Value:
                 fields_read                      -1.0
                   read_time                       0.0

org.apache.beam.sdk.extensions.sql.meta.provider.bigquery.BigQueryIOPushDownIT > readUsingDirectReadMethodPushDown STANDARD_ERROR
    Aug 20, 2020 6:45:32 AM 
org.apache.beam.sdk.testutils.publishing.InfluxDBPublisher publishWithCheck
    WARNING: Missing property -- measurement/database. Metrics won't be 
published.

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:extensions:sql:perf-tests:integrationTest FAILED

3 tests completed, 2 failed
Finished generating test XML results (0.018 secs) into: 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: 
<https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest>
:sdks:java:extensions:sql:perf-tests:integrationTest (Thread[Daemon worker,5,main]) completed. Took 29.202 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':sdks:java:extensions:sql:perf-tests:integrationTest'.
> There were failing tests. See the report at: 
> <https://ci-beam.apache.org/job/beam_SQLBigQueryIO_Batch_Performance_Test_Java/ws/src/sdks/java/extensions/sql/perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 15s
106 actionable tasks: 70 executed, 36 from cache

Publishing build scan...
https://gradle.com/s/7okw5cb2wvoqe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
