See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/65/display/redirect?page=changes>
Changes:

[noreply] Add Java PVR Flink Batch action (#28221)


------------------------------------------
[...truncated 234.75 KB...]
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job with id: [2023-08-31_07_56_32-7013978121211156833]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 2023-08-31_07_56_32-7013978121211156833
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-31_07_56_32-7013978121211156833?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-31_07_56_32-7013978121211156833?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-31_07_56_32-7013978121211156833?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 2023-08-31_07_56_32-7013978121211156833 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T14:56:37.121Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T14:56:39.772Z: JOB_MESSAGE_BASIC: Executing operation ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T14:56:39.829Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T14:56:40.049Z: JOB_MESSAGE_BASIC: Finished operation ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T14:56:47.834Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors.
See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T14:56:49.001Z: JOB_MESSAGE_BASIC: Executing operation ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Impulse+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/MapElements/Map/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Split/ParMultiDo(Split)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Pair with random key/ParMultiDo(AssignShard)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Reify+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:50.956Z: JOB_MESSAGE_BASIC: Finished operation ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Impulse+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/MapElements/Map/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Split/ParMultiDo(Split)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Pair with random key/ParMultiDo(AssignShard)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Reify+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:51.005Z: JOB_MESSAGE_BASIC: Executing operation ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:52.017Z: JOB_MESSAGE_BASIC: Finished operation ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:52.071Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:52.185Z: 
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:52.290Z: JOB_MESSAGE_BASIC: Executing operation ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Read+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/GroupByWindow+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Values/Values/Map/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Read/ParMultiDo(Read)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/StripIds/ParMultiDo(StripIds)+ReadFromKafka/Remove Kafka Metadata/ParMultiDo(Anonymous)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:52.310Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:3736>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:52.954Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:3736>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:56.700Z: JOB_MESSAGE_BASIC: Finished operation 
ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Read+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/GroupByWindow+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Values/Values/Map/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Read/ParMultiDo(Read)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/StripIds/ParMultiDo(StripIds)+ReadFromKafka/Remove Kafka Metadata/ParMultiDo(Anonymous)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:56.813Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:57.101Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:57.154Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read+assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:57.675Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read+assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:00:57.880Z: JOB_MESSAGE_BASIC: Stopping worker pool... [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:03:03.458Z: JOB_MESSAGE_BASIC: Worker pool stopped. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 2023-08-31_07_56_32-7013978121211156833 is in state JOB_STATE_DONE [32mPASSED[0m apache_beam/io/external/xlang_kafkaio_it_test.py::CrossLanguageKafkaIOTest::test_hosted_kafkaio_populated_key [1m-------------------------------- live log call ---------------------------------[0m [32mINFO [0m apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/sdks/python/build/apache_beam-2.51.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl"> to staging location. 
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:397 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild [32mINFO [0m root:environments.py:313 Using provided Python SDK container image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest [32mINFO [0m root:environments.py:320 Python SDK container image set to "gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest" for Docker environment [32mINFO [0m apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 ==================== <function pack_combiners at 0x7f0736672b80> ==================== [32mINFO [0m apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 ==================== <function sort_stages at 0x7f07366733a0> ==================== [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/icedtea-sound-uoIQCGhMffp0D_gc1dkBtXBGXr18Bkb5Qx-e-CvP00s.jar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/icedtea-sound-uoIQCGhMffp0D_gc1dkBtXBGXr18Bkb5Qx-e-CvP00s.jar in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/jaccess-T1qVdaqZDj2uNq1tdHEk1lwFMYQiwMhYa0jqGU9fUMg.jar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/jaccess-T1qVdaqZDj2uNq1tdHEk1lwFMYQiwMhYa0jqGU9fUMg.jar in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/localedata-dQT9YOGvbd1zEqcZsyJ1W4l8jNhxoX3b1nmMhdIbCb8.jar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/localedata-dQT9YOGvbd1zEqcZsyJ1W4l8jNhxoX3b1nmMhdIbCb8.jar in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/nashorn-gZxbPAWCb1KP7OCbxUu1prmT4YMEDX-mGdJFjkB3pfs.jar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/nashorn-gZxbPAWCb1KP7OCbxUu1prmT4YMEDX-mGdJFjkB3pfs.jar in 0 seconds. 
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/cldrdata-_GVjPlS0invvNh7wlTfv_CNVOxTj3Y-xvSW1Xw2r_-4.jar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/cldrdata-_GVjPlS0invvNh7wlTfv_CNVOxTj3Y-xvSW1Xw2r_-4.jar in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/dnsns-lI-xDSEZtu3C8A4-qt_7RcyP_ql7OU0LNZreNR5piTU.jar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/dnsns-lI-xDSEZtu3C8A4-qt_7RcyP_ql7OU0LNZreNR5piTU.jar in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/beam-sdks-java-io-expansion-service-2.51.0-SNAPSHOT-spnynoPaQ81vsCeMGr3Sc6dcYP30FbFTvqGL5lAJ9Q4.jar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/beam-sdks-java-io-expansion-service-2.51.0-SNAPSHOT-spnynoPaQ81vsCeMGr3Sc6dcYP30FbFTvqGL5lAJ9Q4.jar in 6 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/apache_beam-2.51.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/apache_beam-2.51.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/pipeline.pb... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831150315-674256-durm23ee.1693494195.674415/pipeline.pb in 0 seconds. 
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: <Job
 clientRequestId: '20230831150315675258-9154'
 createTime: '2023-08-31T15:03:24.387367Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-08-31_08_03_23-10057479879011415589'
 location: 'us-central1'
 name: 'beamapp-jenkins-0831150315-674256-durm23ee'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-08-31T15:03:24.387367Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job with id: [2023-08-31_08_03_23-10057479879011415589]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 2023-08-31_08_03_23-10057479879011415589
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-31_08_03_23-10057479879011415589?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-31_08_03_23-10057479879011415589?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-31_08_03_23-10057479879011415589?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 2023-08-31_08_03_23-10057479879011415589 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:03:27.992Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:03:29.864Z: JOB_MESSAGE_BASIC: Executing operation Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:03:29.924Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:03:30.205Z: JOB_MESSAGE_BASIC: Finished operation Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:03:39.097Z: JOB_MESSAGE_BASIC: Executing operation Generate/Impulse+Generate/FlatMap(<lambda at core.py:3736>)+Generate/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:03:43.803Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors.
See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:07:55.789Z: JOB_MESSAGE_BASIC: Finished operation Generate/Impulse+Generate/FlatMap(<lambda at core.py:3736>)+Generate/MaybeReshuffle/Reshuffle/AddRandomKeys+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:07:55.851Z: JOB_MESSAGE_BASIC: Executing operation Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:07:56.755Z: JOB_MESSAGE_BASIC: Finished operation Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:07:56.806Z: JOB_MESSAGE_BASIC: Executing operation Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Generate/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Generate/Map(decode)+MakeKV+WriteToKafka/Kafka ProducerRecord/Map/ParMultiDo(Anonymous)+WriteToKafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:07:59.850Z: JOB_MESSAGE_BASIC: Finished operation Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Generate/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Generate/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Generate/Map(decode)+MakeKV+WriteToKafka/Kafka ProducerRecord/Map/ParMultiDo(Anonymous)+WriteToKafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter) [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:08:00.020Z: JOB_MESSAGE_BASIC: Stopping worker pool... [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:10:21.155Z: JOB_MESSAGE_BASIC: Worker pool stopped. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 2023-08-31_08_03_23-10057479879011415589 is in state JOB_STATE_DONE [32mINFO [0m apache_beam.runners.portability.stager:stager.py:322 Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/sdks/python/build/apache_beam-2.51.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl"> to staging location. [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:397 Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. 
Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild [32mINFO [0m root:environments.py:313 Using provided Python SDK container image: gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest [32mINFO [0m root:environments.py:320 Python SDK container image set to "gcr.io/apache-beam-testing/beam-sdk/beam_python3.8_sdk:latest" for Docker environment [32mINFO [0m apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 ==================== <function pack_combiners at 0x7f0736672b80> ==================== [32mINFO [0m apache_beam.runners.portability.fn_api_runner.translations:translations.py:710 ==================== <function sort_stages at 0x7f07366733a0> ==================== [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/icedtea-sound-uoIQCGhMffp0D_gc1dkBtXBGXr18Bkb5Qx-e-CvP00s.jar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/icedtea-sound-uoIQCGhMffp0D_gc1dkBtXBGXr18Bkb5Qx-e-CvP00s.jar in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/jaccess-T1qVdaqZDj2uNq1tdHEk1lwFMYQiwMhYa0jqGU9fUMg.jar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/jaccess-T1qVdaqZDj2uNq1tdHEk1lwFMYQiwMhYa0jqGU9fUMg.jar in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/localedata-dQT9YOGvbd1zEqcZsyJ1W4l8jNhxoX3b1nmMhdIbCb8.jar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/localedata-dQT9YOGvbd1zEqcZsyJ1W4l8jNhxoX3b1nmMhdIbCb8.jar in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/nashorn-gZxbPAWCb1KP7OCbxUu1prmT4YMEDX-mGdJFjkB3pfs.jar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/nashorn-gZxbPAWCb1KP7OCbxUu1prmT4YMEDX-mGdJFjkB3pfs.jar in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/cldrdata-_GVjPlS0invvNh7wlTfv_CNVOxTj3Y-xvSW1Xw2r_-4.jar... 
[32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/cldrdata-_GVjPlS0invvNh7wlTfv_CNVOxTj3Y-xvSW1Xw2r_-4.jar in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/dnsns-lI-xDSEZtu3C8A4-qt_7RcyP_ql7OU0LNZreNR5piTU.jar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/dnsns-lI-xDSEZtu3C8A4-qt_7RcyP_ql7OU0LNZreNR5piTU.jar in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/beam-sdks-java-io-expansion-service-2.51.0-SNAPSHOT-spnynoPaQ81vsCeMGr3Sc6dcYP30FbFTvqGL5lAJ9Q4.jar... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/beam-sdks-java-io-expansion-service-2.51.0-SNAPSHOT-spnynoPaQ81vsCeMGr3Sc6dcYP30FbFTvqGL5lAJ9Q4.jar in 3 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/apache_beam-2.51.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/apache_beam-2.51.0.dev0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl in 0 seconds. [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:668 Starting GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/pipeline.pb... [32mINFO [0m apache_beam.runners.dataflow.internal.apiclient:apiclient.py:684 Completed GCS upload to gs://dataflow-staging-us-central1-77b801c0838aee13391c0d1885860494/beamapp-jenkins-0831151036-675188-9wcd0cao.1693494636.675349/pipeline.pb in 0 seconds. 
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:843 Create job: <Job
 clientRequestId: '20230831151036676247-9503'
 createTime: '2023-08-31T15:10:42.067697Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-08-31_08_10_41-5809094679400334409'
 location: 'us-central1'
 name: 'beamapp-jenkins-0831151036-675188-9wcd0cao'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-08-31T15:10:42.067697Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:845 Created job with id: [2023-08-31_08_10_41-5809094679400334409]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:846 Submitted job: 2023-08-31_08_10_41-5809094679400334409
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:847 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-31_08_10_41-5809094679400334409?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-31_08_10_41-5809094679400334409?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:58 Console log:
INFO     apache_beam.runners.dataflow.test_dataflow_runner:test_dataflow_runner.py:59 https://console.cloud.google.com/dataflow/jobs/us-central1/2023-08-31_08_10_41-5809094679400334409?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 2023-08-31_08_10_41-5809094679400334409 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:10:46.406Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-c.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:10:48.978Z: JOB_MESSAGE_BASIC: Executing operation ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:10:49.047Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
[32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:10:49.889Z: JOB_MESSAGE_BASIC: Finished operation ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:10:58.304Z: JOB_MESSAGE_BASIC: Executing operation ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Impulse+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/MapElements/Map/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Split/ParMultiDo(Split)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Pair with random key/ParMultiDo(AssignShard)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Reify+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:11:11.001Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. 
See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:28.336Z: JOB_MESSAGE_BASIC: Finished operation ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/Impulse+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Create/MapElements/Map/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Split/ParMultiDo(Split)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Pair with random key/ParMultiDo(AssignShard)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Reify+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:28.391Z: JOB_MESSAGE_BASIC: Executing operation ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:29.453Z: JOB_MESSAGE_BASIC: Finished operation ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:29.514Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:29.647Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Create [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:29.766Z: JOB_MESSAGE_BASIC: Executing operation 
ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Read+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/GroupByWindow+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Values/Values/Map/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Read/ParMultiDo(Read)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/StripIds/ParMultiDo(StripIds)+ReadFromKafka/Remove Kafka Metadata/ParMultiDo(Anonymous)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:29.791Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:3736>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:30.112Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Impulse+assert_that/Create/FlatMap(<lambda at core.py:3736>)+assert_that/Create/Map(decode)+assert_that/Group/CoGroupByKeyImpl/Tag[0]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:32.206Z: JOB_MESSAGE_BASIC: Finished operation 
ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/Read+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/GroupByKey/GroupByWindow+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Reshuffle/Values/Values/Map/ParMultiDo(Anonymous)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/Read/ParMultiDo(Read)+ReadFromKafka/KafkaIO.Read/KafkaIO.Read.ReadFromKafkaViaUnbounded/Read(KafkaUnboundedSource)/StripIds/ParMultiDo(StripIds)+ReadFromKafka/Remove Kafka Metadata/ParMultiDo(Anonymous)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/CoGroupByKeyImpl/Tag[1]+assert_that/Group/CoGroupByKeyImpl/Flatten/InputIdentity+assert_that/Group/CoGroupByKeyImpl/GroupByKey/Write [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:32.261Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:32.568Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Close [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:32.624Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read+assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:33.107Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/CoGroupByKeyImpl/GroupByKey/Read+assert_that/Group/CoGroupByKeyImpl/MapTuple(collect_values)+assert_that/Group/RestoreTags+assert_that/Unkey+assert_that/Match [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:14:33.321Z: JOB_MESSAGE_BASIC: Stopping worker pool... [32mINFO [0m apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:204 2023-08-31T15:16:54.933Z: JOB_MESSAGE_BASIC: Worker pool stopped. 
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:154 Job 2023-08-31_08_10_41-5809094679400334409 is in state JOB_STATE_DONE
PASSED

=============================== warnings summary ===============================
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/hdfs/config.py:28
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/hdfs/config.py>:28: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    from imp import load_source

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/google/rpc/__init__.py:18
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/google/rpc/__init__.py>:18: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
    import pkg_resources

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2871: 20 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2871: 16 warnings
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.cloud')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2350
../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2350
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2350: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(parent)

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2871
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.logging')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py:2871
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/pkg_resources/__init__.py>:2871: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.iam')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

../../build/gradleenv/-1734967051/lib/python3.8/site-packages/google/rpc/__init__.py:20
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/build/gradleenv/-1734967051/lib/python3.8/site-packages/google/rpc/__init__.py>:20: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('google.rpc')`. Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    pkg_resources.declare_namespace(__name__)

apache_beam/io/external/xlang_kafkaio_it_test.py::CrossLanguageKafkaIOTest::test_hosted_kafkaio_null_key
apache_beam/io/external/xlang_kafkaio_it_test.py::CrossLanguageKafkaIOTest::test_hosted_kafkaio_null_key
apache_beam/io/external/xlang_kafkaio_it_test.py::CrossLanguageKafkaIOTest::test_hosted_kafkaio_populated_key
apache_beam/io/external/xlang_kafkaio_it_test.py::CrossLanguageKafkaIOTest::test_hosted_kafkaio_populated_key
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/sdks/python/apache_beam/transforms/external.py>:679: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self._expansion_service, pipeline.options)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Xlang_IO_Dataflow/ws/src/sdks/python/pytest_ioCrossLanguage.xml> -
=== 2 passed, 17 skipped, 7149 deselected, 49 warnings in 1646.29s (0:27:26) ===

> Task :sdks:python:test-suites:dataflow:py38:ioCrossLanguageCleanup
Stopping expansion service pid: 2125154.
Skipping invalid pid: 2125155.

> Task :sdks:python:test-suites:xlang:fnApiJobServerCleanup
Killing process at 2099590

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py311:ioCrossLanguagePythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.6.2/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 38m 44s

101 actionable tasks: 71 executed, 26 from cache, 4 up-to-date

Publishing build scan...
https://ge.apache.org/s/osw2d2p5ix5i4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
