See <https://builds.apache.org/job/beam_PostCommit_Java_PortabilityApi/4587/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-9199] Require Dataflow --region in Python SDK.

[kcweaver] Add --region to tests where needed.

[kcweaver] [BEAM-9199] Require --region option for Dataflow in Java SDK.

[kcweaver] Add --region to Java GCP tests.

[kcweaver] Fix DataflowRunnerTest.

[kcweaver] Fix more Java unit tests missing --region.

[kcweaver] Add --region to DF streaming example tests.

[kcweaver] Add unit tests for get_default_gcp_region

[kcweaver] Add --region to Dataflow runner webpage.

[kcweaver] lint

[kcweaver] Add --region to more Java tests and examples.

[kcweaver] Add --region to more Python tests and examples.

[kcweaver] format

[kcweaver] Remove unrecognized --region option from non-DF tests.

[github] [BEAM-9685] remove Go SDK container from release process (#11308)
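
For context on the BEAM-9199 commits above: Dataflow pipelines must now name a
region explicitly, either on the command line (e.g. --region=us-central1) or
through pipeline options. A minimal Java sketch, assuming the public
DataflowPipelineOptions API; the region value is only an illustrative placeholder:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class RegionOptionSketch {
      public static void main(String[] args) {
        // Pick up flags such as --runner=DataflowRunner --region=us-central1.
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // Or set the region in code; "us-central1" is an assumed example value,
        // not something taken from this build.
        options.setRegion("us-central1");
      }
    }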


------------------------------------------
[...truncated 58.49 KB...]
> Task :sdks:java:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@<https://builds.apache.org/job/beam_PostCommit_Java_PortabilityApi/ws/src/sdks/go>

> Task :sdks:java:container:installDependencies
> Task :sdks:java:container:buildLinuxAmd64
> Task :sdks:java:container:goBuild
> Task :sdks:java:container:dockerPrepare
> Task :sdks:java:container:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
1d730684374b: Preparing
cd7c3923575c: Preparing
14fa3b8c74c8: Preparing
090b7bbcd4c4: Preparing
e8c9fca69e47: Preparing
d138ed12138e: Preparing
704d7fa38b8c: Preparing
7e6d56f5d0db: Preparing
892007193bb6: Preparing
e811ee12aa10: Preparing
23f8d486123a: Preparing
afae6f50abb9: Preparing
136a15f81f25: Preparing
185574602537: Preparing
24efcd549ab5: Preparing
e811ee12aa10: Waiting
afae6f50abb9: Waiting
23f8d486123a: Waiting
136a15f81f25: Waiting
24efcd549ab5: Waiting
d138ed12138e: Waiting
185574602537: Waiting
7e6d56f5d0db: Waiting
892007193bb6: Waiting
704d7fa38b8c: Waiting
14fa3b8c74c8: Layer already exists
090b7bbcd4c4: Layer already exists
1d730684374b: Layer already exists
e8c9fca69e47: Layer already exists
cd7c3923575c: Layer already exists
d138ed12138e: Layer already exists
704d7fa38b8c: Layer already exists
7e6d56f5d0db: Layer already exists
892007193bb6: Layer already exists
e811ee12aa10: Layer already exists
23f8d486123a: Layer already exists
afae6f50abb9: Layer already exists
24efcd549ab5: Layer already exists
136a15f81f25: Layer already exists
185574602537: Layer already exists
20200406103057: digest: sha256:fba8686a985756bb1f1880c5c06dc68f6804d34b96d63317300c03615b07fce2 size: 3470
56f9afae918a: Preparing
3d7fade9ce03: Preparing
1c564c2e95fc: Preparing
07abfed94da3: Preparing
ebdea2a60926: Preparing
6e61ed38f788: Preparing
ba83fcc15c61: Preparing
a99ef8bcb5a1: Preparing
6e61ed38f788: Waiting
892007193bb6: Preparing
e811ee12aa10: Preparing
ba83fcc15c61: Waiting
a99ef8bcb5a1: Waiting
23f8d486123a: Preparing
afae6f50abb9: Preparing
e811ee12aa10: Waiting
136a15f81f25: Preparing
185574602537: Preparing
892007193bb6: Waiting
23f8d486123a: Waiting
136a15f81f25: Waiting
afae6f50abb9: Waiting
185574602537: Waiting
24efcd549ab5: Preparing
07abfed94da3: Layer already exists
3d7fade9ce03: Layer already exists
ebdea2a60926: Layer already exists
56f9afae918a: Layer already exists
892007193bb6: Layer already exists
afae6f50abb9: Layer already exists
e811ee12aa10: Layer already exists
23f8d486123a: Layer already exists
136a15f81f25: Layer already exists
185574602537: Layer already exists
24efcd549ab5: Layer already exists
6e61ed38f788: Layer already exists
1c564c2e95fc: Layer already exists
ba83fcc15c61: Layer already exists
a99ef8bcb5a1: Layer already exists
20200406150117: digest: sha256:aa3aab18d199dcdf3188d1f6514f562cf79eaa62211550170bc7b447bef008dc size: 3470
bc751af1b533: Preparing
33d58848545f: Preparing
40047668bb6d: Preparing
6f3d47db8e69: Preparing
944f3905a976: Preparing
3e73aca50788: Preparing
b73d8c1bbfe2: Preparing
14a22486f57e: Preparing
892007193bb6: Preparing
e811ee12aa10: Preparing
23f8d486123a: Preparing
afae6f50abb9: Preparing
136a15f81f25: Preparing
185574602537: Preparing
24efcd549ab5: Preparing
afae6f50abb9: Waiting
136a15f81f25: Waiting
14a22486f57e: Waiting
185574602537: Waiting
892007193bb6: Waiting
e811ee12aa10: Waiting
23f8d486123a: Waiting
24efcd549ab5: Waiting
3e73aca50788: Waiting
b73d8c1bbfe2: Waiting
bc751af1b533: Pushed
6f3d47db8e69: Pushed
33d58848545f: Pushed
944f3905a976: Pushed
e811ee12aa10: Layer already exists
afae6f50abb9: Layer already exists
23f8d486123a: Layer already exists
136a15f81f25: Layer already exists
185574602537: Layer already exists
24efcd549ab5: Layer already exists
892007193bb6: Layer already exists
40047668bb6d: Pushed
b73d8c1bbfe2: Pushed
14a22486f57e: Pushed
3e73aca50788: Pushed
20200407005004: digest: sha256:daea0aa5388ae4e5671b822eaf350509356adc7897a1a7df80347ce2fa1a5d1b size: 3470

> Task :runners:google-cloud-dataflow-java:coreSDKJavaFnApiWorkerIntegrationTest NO-SOURCE
> Task :runners:google-cloud-dataflow-java:examplesJavaFnApiWorkerIntegrationTest

org.apache.beam.examples.cookbook.BigQueryTornadoesIT > testE2EBigQueryTornadoesWithExport FAILED
    java.lang.RuntimeException at BigQueryTornadoesIT.java:68

org.apache.beam.examples.cookbook.BigQueryTornadoesIT > testE2eBigQueryTornadoesWithStorageApi FAILED
    java.lang.RuntimeException at BigQueryTornadoesIT.java:68

2 tests completed, 2 failed

> Task :runners:google-cloud-dataflow-java:examplesJavaFnApiWorkerIntegrationTest FAILED
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformFnApiWorkerIntegrationTest

org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteIT > testE2EBigtableWrite FAILED
    java.lang.RuntimeException at BigtableWriteIT.java:130

org.apache.beam.sdk.io.gcp.bigquery.BigQueryTimePartitioningClusteringIT > testE2EBigQueryClusteringTableFunction FAILED
    java.lang.RuntimeException at BigQueryTimePartitioningClusteringIT.java:199

org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaUpdateOptionsIT > testAllowFieldAddition FAILED
    java.lang.RuntimeException at BigQuerySchemaUpdateOptionsIT.java:154

org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testNewTypesQueryWithReshuffle FAILED
    java.lang.RuntimeException at BigQueryToTableIT.java:118

org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaUpdateOptionsIT > testAllowFieldRelaxation FAILED
    java.lang.RuntimeException at BigQuerySchemaUpdateOptionsIT.java:154

org.apache.beam.sdk.io.gcp.bigquery.BigQueryTimePartitioningClusteringIT > testE2EBigQueryClusteringDynamicDestinations FAILED
    java.lang.RuntimeException at BigQueryTimePartitioningClusteringIT.java:222

org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testLegacyQueryWithoutReshuffle FAILED
    java.lang.RuntimeException at BigQueryToTableIT.java:118

org.apache.beam.sdk.io.gcp.bigquery.BigQueryTimePartitioningClusteringIT > testE2EBigQueryTimePartitioning FAILED
    java.lang.RuntimeException at BigQueryTimePartitioningClusteringIT.java:145

org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testNewTypesQueryWithoutReshuffle FAILED
    java.lang.RuntimeException at BigQueryToTableIT.java:118

org.apache.beam.sdk.io.gcp.bigquery.BigQueryTimePartitioningClusteringIT > testE2EBigQueryClustering FAILED
    java.lang.RuntimeException at BigQueryTimePartitioningClusteringIT.java:170

org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testStandardQueryWithoutCustom FAILED
    java.lang.RuntimeException at BigQueryToTableIT.java:118

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1WriteWithLargeEntities FAILED
    java.lang.RuntimeException at V1WriteIT.java:104

org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write FAILED
    java.lang.RuntimeException at V1WriteIT.java:69

13 tests completed, 13 failed

> Task :runners:google-cloud-dataflow-java:googleCloudPlatformFnApiWorkerIntegrationTest FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20200407005004
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:daea0aa5388ae4e5671b822eaf350509356adc7897a1a7df80347ce2fa1a5d1b
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:daea0aa5388ae4e5671b822eaf350509356adc7897a1a7df80347ce2fa1a5d1b
  Associated tags:
 - 20200407005004
Tags:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java:20200407005004
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20200407005004].
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:daea0aa5388ae4e5671b822eaf350509356adc7897a1a7df80347ce2fa1a5d1b].

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:examplesJavaFnApiWorkerIntegrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_PortabilityApi/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/examplesJavaFnApiWorkerIntegrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:googleCloudPlatformFnApiWorkerIntegrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_PortabilityApi/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformFnApiWorkerIntegrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 34m 42s
103 actionable tasks: 73 executed, 29 from cache, 1 up-to-date

Publishing build scan...
https://gradle.com/s/ebisoi2qvpbmq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
