See <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/10738/display/redirect?page=changes>
Changes: [noreply] [Go SDK] Timers with new datalayer (#26101)

------------------------------------------
[...truncated 396.83 KB...]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0428144210-840290-4kdurm23.1682692930.840500/mock-2.0.0-py2.py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0428144210-840290-4kdurm23.1682692930.840500/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0428144210-840290-4kdurm23.1682692930.840500/PyHamcrest-1.10.1-py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0428144210-840290-4kdurm23.1682692930.840500/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0428144210-840290-4kdurm23.1682692930.840500/parameterized-0.7.5-py2.py3-none-any.whl...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0428144210-840290-4kdurm23.1682692930.840500/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0428144210-840290-4kdurm23.1682692930.840500/dataflow_python_sdk.tar...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0428144210-840290-4kdurm23.1682692930.840500/dataflow_python_sdk.tar in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:734 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0428144210-840290-4kdurm23.1682692930.840500/pipeline.pb...
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:753 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0428144210-840290-4kdurm23.1682692930.840500/pipeline.pb in 0 seconds.
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:909 Create job: <Job clientRequestId: '20230428144210841585-3036' createTime: '2023-04-28T14:42:12.989553Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2023-04-28_07_42_12-17996934847928121077' location: 'us-central1' name: 'beamapp-jenkins-0428144210-840290-4kdurm23' projectId: 'apache-beam-testing' stageStates: [] startTime: '2023-04-28T14:42:12.989553Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:911 Created job with id: [2023-04-28_07_42_12-17996934847928121077]
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:912 Submitted job: 2023-04-28_07_42_12-17996934847928121077
INFO apache_beam.runners.dataflow.internal.apiclient:apiclient.py:918 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-28_07_42_12-17996934847928121077?project=apache-beam-testing
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-28_07_42_12-17996934847928121077?project=apache-beam-testing
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 2023-04-28_07_42_12-17996934847928121077 is in state JOB_STATE_RUNNING
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:13.478Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2023-04-28_07_42_12-17996934847928121077. The number of workers will be between 1 and 1000.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:13.586Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2023-04-28_07_42_12-17996934847928121077.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:15.417Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:16.788Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:16.823Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/CoGroupByKeyImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:16.858Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:16.886Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:16.949Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
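The monitoring-console URL that the apiclient prints above follows a fixed pattern of region, job id, and project. A minimal sketch of assembling it from the values shown in this log (the helper name is illustrative, not part of the SDK):

```python
# Sketch: building the Dataflow monitoring-console URL printed by
# apiclient.py above. The URL pattern is taken verbatim from the log;
# console_url() is a hypothetical helper, not a Beam API.
def console_url(region, job_id, project):
    """Return the Cloud Console URL for a Dataflow job."""
    return (
        "https://console.cloud.google.com/dataflow/jobs/"
        f"{region}/{job_id}?project={project}"
    )

print(console_url(
    "us-central1",
    "2023-04-28_07_42_12-17996934847928121077",
    "apache-beam-testing",
))
```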
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:16.989Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.025Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.None
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.060Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify, through flatten assert_that/Group/CoGroupByKeyImpl/Flatten, into producer assert_that/Group/CoGroupByKeyImpl/Tag[1]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.094Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/GroupByKey/GroupByWindow into assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.126Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values) into assert_that/Group/CoGroupByKeyImpl/GroupByKey/GroupByWindow
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.158Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/RestoreTags into assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.190Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/RestoreTags
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.222Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.252Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.287Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write, through flatten assert_that/Group/CoGroupByKeyImpl/Flatten/Unzipped-1, into producer assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.318Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify into assert_that/Group/CoGroupByKeyImpl/Tag[0]
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.351Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write into assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.383Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Tag[0] into assert_that/Create/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.419Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:259>)/Map(<lambda at sideinputs_test.py:259>) into main input/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.453Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Map(<lambda at sideinputs_test.py:259>)/Map(<lambda at sideinputs_test.py:259>)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.483Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.515Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/CoGroupByKeyImpl/Tag[1] into assert_that/ToVoidKey
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.556Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.582Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.603Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.635Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.771Z: JOB_MESSAGE_DEBUG: Executing wait step start21
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.837Z: JOB_MESSAGE_BASIC: Executing operation side list/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.864Z: JOB_MESSAGE_BASIC: Finished operation side list/Read
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.926Z: JOB_MESSAGE_DEBUG: Value "side list/Read.out" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:17.991Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:259>)/_UnpickledSideInput(Read.out.0)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:18.024Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:259>)/_UnpickledSideInput(Read.out.1)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:18.025Z: JOB_MESSAGE_BASIC: Finished operation Map(<lambda at sideinputs_test.py:259>)/_UnpickledSideInput(Read.out.0)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:18.050Z: JOB_MESSAGE_BASIC: Finished operation Map(<lambda at sideinputs_test.py:259>)/_UnpickledSideInput(Read.out.1)
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:18.079Z: JOB_MESSAGE_DEBUG: Value "Map(<lambda at sideinputs_test.py:259>)/_UnpickledSideInput(Read.out.0).output" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:18.114Z: JOB_MESSAGE_DEBUG: Value "Map(<lambda at sideinputs_test.py:259>)/_UnpickledSideInput(Read.out.1).output" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:18.168Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:18.212Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:18.245Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:18.607Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:18.675Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/CoGroupByKeyImpl/GroupByKey/Session" materialized.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:18.739Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:18.771Z: JOB_MESSAGE_BASIC: Executing operation main input/Read+Map(<lambda at sideinputs_test.py:259>)/Map(<lambda at sideinputs_test.py:259>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:25.379Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:57.223Z: JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-c failed to bring up any of the desired 1 workers. Please refer to https://cloud.google.com/dataflow/docs/guides/common-errors#worker-pool-failure for help troubleshooting. ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS: Instance 'beamapp-jenkins-042814421-04280742-3lj6-harness-pz2s' creation failed: The zone 'projects/apache-beam-testing/zones/us-central1-a' does not have enough resources available to fulfill the request. '(resource type:compute)'.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:57.250Z: JOB_MESSAGE_ERROR: Workflow failed.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:57.315Z: JOB_MESSAGE_BASIC: Finished operation main input/Read+Map(<lambda at sideinputs_test.py:259>)/Map(<lambda at sideinputs_test.py:259>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:57.315Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Reify+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:57.386Z: JOB_MESSAGE_DETAILED: Cleaning up.
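The root cause above is capacity, not code: the worker pool never came up because the chosen zone had no compute resources left (ZONE_RESOURCE_POOL_EXHAUSTED_WITH_DETAILS). When the service-picked zone is exhausted, one common mitigation is to pin the pipeline's region and worker zone explicitly. A hedged sketch of how the relevant flags might be assembled; `--region` and `--worker_zone` are standard Beam/Dataflow pipeline options, but the helper and the specific zone below are illustrative, not taken from this log:

```python
# Sketch: pinning Dataflow worker placement to route around an exhausted
# zone. dataflow_placement_args() is a hypothetical helper; the zone
# "us-central1-f" is an illustrative alternative, not a recommendation.
def dataflow_placement_args(region="us-central1", worker_zone=None):
    """Build pipeline argv entries controlling where workers are created."""
    args = ["--runner=TestDataflowRunner", f"--region={region}"]
    if worker_zone:
        # Omitting --worker_zone lets the service pick a zone (as this job
        # did); pinning one trades flexibility for predictable capacity.
        args.append(f"--worker_zone={worker_zone}")
    return args

print(dataflow_placement_args(worker_zone="us-central1-f"))
```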
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:57.449Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:42:57.481Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:43:15.954Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:238 2023-04-28T14:43:16.001Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:193 Job 2023-04-28_07_42_12-17996934847928121077 is in state JOB_STATE_FAILED
ERROR apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:1554 Console URL: https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-04-28_07_42_12-17996934847928121077?project=<ProjectId>
=============================== warnings summary ===============================
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py:15 (9 occurrences)
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/hdfs/config.py>:15: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    from imp import load_source
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:121 (9 occurrences)
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py>:121: DeprecationWarning: pkg_resources is deprecated as an API
    warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:2870: 162 warnings
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:2870: 117 warnings
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.cloud')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:2349: 36 warnings
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py>:2349: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(parent)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:2870 (9 occurrences)
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.logging')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py:2870 (9 occurrences)
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/pkg_resources/__init__.py>:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.iam')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)
../../build/gradleenv/-1734967052/lib/python3.7/site-packages/google/rpc/__init__.py:20 (9 occurrences)
<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/google/rpc/__init__.py>:20: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.rpc')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    pkg_resources.declare_namespace(__name__)
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/pytest_validatesRunnerBatchTests-df-py37.xml> -
=========================== short test summary info ============================
FAILED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error: Workflow failed.
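All of the failures in the short summary below share one root cause (the workflow died before any worker came up), so the useful piece of each `FAILED` line is its pytest node id, which can be fed back to pytest for a local rerun. A small sketch of extracting the node ids from a summary of this shape; the `failed_node_ids` helper is illustrative, not part of the build:

```python
import re

# Two entries copied from the short test summary of this build; the full
# summary has seven FAILED lines of exactly this shape.
SUMMARY = """\
FAILED apache_beam/transforms/ptransform_test.py::PTransformTest::test_flatten_a_flattened_pcollection - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error: Workflow failed.
FAILED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error: Workflow failed.
"""

def failed_node_ids(summary):
    """Extract pytest node ids from the FAILED lines of a short summary."""
    return re.findall(r"^FAILED (\S+)", summary, flags=re.M)

# Each extracted id can be rerun directly, e.g. `pytest <node_id>`.
for node_id in failed_node_ids(SUMMARY):
    print(node_id)
```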
FAILED apache_beam/transforms/ptransform_test.py::PTransformTest::test_multiple_empty_outputs - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error: Workflow failed.
FAILED apache_beam/transforms/ptransform_test.py::PTransformTest::test_par_do_with_multiple_outputs_and_using_yield - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error: Workflow failed.
FAILED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_with_different_defaults - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error: Workflow failed.
FAILED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_dict_twice - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error: Workflow failed.
FAILED apache_beam/transforms/ptransform_test.py::PTransformTest::test_undeclared_outputs - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error: Workflow failed.
FAILED apache_beam/transforms/sideinputs_test.py::SideInputsTest::test_as_singleton_without_unique_labels - apache_beam.runners.dataflow.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed. State: FAILED, Error: Workflow failed.
===== 7 failed, 27 passed, 13 skipped, 360 warnings in 2354.40s (0:39:14) ======

> Task :sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 242

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py39:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 242

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 42m 9s
18 actionable tasks: 12 executed, 4 from cache, 2 up-to-date

Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
    at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
    at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
    at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
    at java.lang.reflect.WeakCache.get(WeakCache.java:127)
    at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
    at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
    at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
    at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
    at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Publishing build scan...
https://gradle.com/s/vou6scyqcrpgi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
