See
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Dataflow/41/display/redirect?page=changes>
Changes:
[noreply] Update 2.36.0 blog post to mention ARM64 support
[noreply] Minor: Disable checker framework in nightly snapshot (#16829)
[noreply] [BEAM-13860] Make `DoFn.infer_output_type` return element type (#16788)
------------------------------------------
[...truncated 44.38 KB...]
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/io/gcp/bigquery_file_loads.py:1130
apache_beam/io/gcp/bigquery_file_loads.py:1130
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Examples_Dataflow/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py:1130:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/io/gcp/tests/utils.py:100
apache_beam/io/gcp/tests/utils.py:100
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Examples_Dataflow/src/sdks/python/apache_beam/io/gcp/tests/utils.py:100:
PendingDeprecationWarning: Client.dataset is deprecated and will be removed in
a future version. Use a string like 'my_project.my_dataset' or a
cloud.google.bigquery.DatasetReference object, instead.
table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/io/gcp/tests/utils.py:63
apache_beam/io/gcp/tests/utils.py:63
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Examples_Dataflow/src/sdks/python/apache_beam/io/gcp/tests/utils.py:63:
PendingDeprecationWarning: Client.dataset is deprecated and will be removed in
a future version. Use a string like 'my_project.my_dataset' or a
cloud.google.bigquery.DatasetReference object, instead.
dataset_ref = client.dataset(unique_dataset_name, project=project)
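Both `PendingDeprecationWarning`s above point at the same replacement: pass a fully-qualified identifier string instead of calling the deprecated `Client.dataset()`. A minimal plain-Python sketch of the string form the warning recommends (the project/dataset/table names here are hypothetical placeholders, not the ones used by the tests):

```python
# Hypothetical names for illustration only.
project, dataset_id, table_id = "my_project", "my_dataset", "my_table"

# Deprecated style:   client.dataset(dataset_id).table(table_id)
# Recommended style:  fully-qualified strings, which the BigQuery client
# methods such as get_dataset()/get_table() accept directly.
dataset_ref = f"{project}.{dataset_id}"
table_ref = f"{dataset_ref}.{table_id}"
```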
apache_beam/examples/dataframe/flight_delays.py:47
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Examples_Dataflow/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py:47:
FutureWarning: Dropping of nuisance columns in DataFrame reductions (with
'numeric_only=None') is deprecated; in a future version this will raise
TypeError. Select only valid columns before calling the reduction.
return airline_df[at_top_airports].mean()
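The `FutureWarning` from `flight_delays.py` is avoided by doing what the message says: select only the valid (numeric) columns before the reduction. A small stand-alone pandas sketch, with made-up column names rather than the example's real schema:

```python
import pandas as pd

df = pd.DataFrame({
    "airline": ["AA", "UA", "AA"],      # non-numeric "nuisance" column
    "arrival_delay": [10.0, 5.0, 3.0],
    "departure_delay": [2.0, 7.0, 0.0],
})

# Deprecated behavior: df.mean() silently drops the 'airline' column.
# Explicit alternative that avoids the FutureWarning:
means = df[["arrival_delay", "departure_delay"]].mean()
# or equivalently: df.mean(numeric_only=True)
```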
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Examples_Dataflow/src/sdks/python/pytest_postCommitIT-df-py38-xdist.xml
-
============= 17 passed, 1 skipped, 48 warnings in 1535.39 seconds =============
>>> RUNNING integration tests with pipeline options:
>>> --runner=TestDataflowRunner --project=apache-beam-testing
>>> --region=us-central1
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
>>> --sdk_location=/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Examples_Dataflow/src/sdks/python/build/apache-beam.tar.gz
>>> --requirements_file=postcommit_requirements.txt --num_workers=1
>>> --sleep_secs=20 --experiments=use_runner_v2
>>> --experiments=shuffle_mode=appliance --experiments=beam_fn_api
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> pytest options: --capture=no --timeout=4500 --color=yes
>>> --log-cli-level=INFO
>>> collect markers: -m=examples_postcommit and no_xdist
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-4.6.11, py-1.11.0, pluggy-0.13.1
rootdir:
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Examples_Dataflow/src/sdks/python,
inifile: pytest.ini
plugins: xdist-1.34.0, timeout-1.4.2, forked-1.4.0, requests-mock-1.9.3
timeout: 4500.0s
timeout method: signal
timeout func_only: False
----------------------------- live log collection ------------------------------
WARNING  root:avroio_test.py:51 python-snappy is not installed; some tests will be skipped.
WARNING  root:tfrecordio_test.py:55 Tensorflow is not installed, so skipping some tests.
WARNING  apache_beam.runners.interactive.interactive_environment:interactive_environment.py:187 Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.
WARNING  apache_beam.runners.interactive.interactive_environment:interactive_environment.py:196 You cannot use Interactive Beam features when you are not in an interactive environment such as a Jupyter notebook or ipython terminal.
WARNING  root:environments.py:371 Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
INFO     root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.8_sdk:2.38.0.dev
collected 5211 items / 5204 deselected / 7 selected
apache_beam/examples/complete/autocomplete_test.py::AutocompleteTest::test_autocomplete_output_files_on_small_input
-------------------------------- live log call ---------------------------------
INFO     root:autocomplete_test.py:65 Creating temp file: /tmp/tmpr05t8ol3/input.txt
INFO     apache_beam.runners.portability.stager:stager.py:754 Executing command: ['/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Examples_Dataflow/src/build/gradleenv/-1734967051/bin/python3.8', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpsca3nloe/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp38', '--platform', 'manylinux2014_x86_64']
INFO     apache_beam.runners.portability.stager:stager.py:325 Copying Beam SDK "/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python_Examples_Dataflow/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING  root:environments.py:371 Make sure that locally built Python SDK docker image has Python 3.8 interpreter.
INFO     root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.8_sdk:2.38.0.dev
INFO     root:environments.py:295 Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20220208
INFO     root:environments.py:302 Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python38-fnapi:beam-master-20220208" for Docker environment
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function pack_combiners at 0x7fa6b72c6ee0> ====================
INFO     apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function sort_stages at 0x7fa6b72cb700> ====================
INFO     apache_beam.internal.gcp.auth:auth.py:105 Setting socket default timeout to 60 seconds.
INFO     apache_beam.internal.gcp.auth:auth.py:107 socket default timeout is 60.0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/requirements.txt...
INFO     oauth2client.transport:transport.py:157 Attempting refresh to obtain initial access_token
INFO     oauth2client.transport:transport.py:157 Attempting refresh to obtain initial access_token
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/requirements.txt in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/pickled_main_session...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/pickled_main_session in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/pbr-5.8.0.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/pbr-5.8.0.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/pbr-5.8.1.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/pbr-5.8.1.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/mock-2.0.0.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/mock-2.0.0.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/six-1.16.0.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/six-1.16.0.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/soupsieve-2.3.1.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/soupsieve-2.3.1.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/PyHamcrest-1.10.1.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/PyHamcrest-1.10.1.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/parameterized-0.7.5.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/parameterized-0.7.5.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/beautifulsoup4-4.10.0.tar.gz...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/beautifulsoup4-4.10.0.tar.gz in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/mock-2.0.0-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/mock-2.0.0-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/seaborn-0.11.2-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/seaborn-0.11.2-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/PyHamcrest-1.10.1-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/PyHamcrest-1.10.1-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/beautifulsoup4-4.10.0-py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/beautifulsoup4-4.10.0-py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/parameterized-0.7.5-py2.py3-none-any.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/parameterized-0.7.5-py2.py3-none-any.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/matplotlib-3.5.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/matplotlib-3.5.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/matplotlib-3.5.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/matplotlib-3.5.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/dataflow_python_sdk.tar...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/dataflow_python_sdk.tar in 0 seconds.
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:706 Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/pipeline.pb...
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:722 Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0214183020-153511-zxz6xnth.1644863420.153673/pipeline.pb in 0 seconds.
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING  apache_beam.options.pipeline_options:pipeline_options.py:309 Discarding unparseable args: ['--sleep_secs=20', '--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:886 Create job:
<Job
 clientRequestId: '20220214183020154825-5768'
 createTime: '2022-02-14T18:30:26.308941Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-02-14_10_30_24-318769559491137115'
 location: 'us-central1'
 name: 'beamapp-jenkins-0214183020-153511-zxz6xnth'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-02-14T18:30:26.308941Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:888 Created job with id: [2022-02-14_10_30_24-318769559491137115]
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:889 Submitted job: 2022-02-14_10_30_24-318769559491137115
INFO     apache_beam.runners.dataflow.internal.apiclient:apiclient.py:890 To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-14_10_30_24-318769559491137115?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-14_10_30_24-318769559491137115?project=apache-beam-testing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:191 Job 2022-02-14_10_30_24-318769559491137115 is in state JOB_STATE_RUNNING
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:27.934Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2022-02-14_10_30_24-318769559491137115. The number of workers will be between 1 and 1000.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:28.176Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2022-02-14_10_30_24-318769559491137115.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:31.493Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.117Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.149Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.214Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.249Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.363Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.388Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.420Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.463Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Map(decode)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.486Z: JOB_MESSAGE_DETAILED: Fusing consumer read/Read/Map(<lambda at iobase.py:898>) into read/Read/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.518Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction into read/Read/Map(<lambda at iobase.py:898>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.551Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing into ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.578Z: JOB_MESSAGE_DETAILED: Fusing consumer split into ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.602Z: JOB_MESSAGE_DETAILED: Fusing consumer TopPerPrefix/PerElement/PerElement:PairWithVoid into split
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.624Z: JOB_MESSAGE_DETAILED: Fusing consumer TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey+TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/Combine/Partial into TopPerPrefix/PerElement/PerElement:PairWithVoid
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.656Z: JOB_MESSAGE_DETAILED: Fusing consumer TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey/Write into TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey+TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/Combine/Partial
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.692Z: JOB_MESSAGE_DETAILED: Fusing consumer TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/Combine into TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.712Z: JOB_MESSAGE_DETAILED: Fusing consumer TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/Combine/Extract into TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/Combine
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.734Z: JOB_MESSAGE_DETAILED: Fusing consumer TopPerPrefix/FlatMap(extract_prefixes) into TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/Combine/Extract
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.756Z: JOB_MESSAGE_DETAILED: Fusing consumer TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey+TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/Combine/Partial into TopPerPrefix/FlatMap(extract_prefixes)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.787Z: JOB_MESSAGE_DETAILED: Fusing consumer TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey/Write into TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey+TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/Combine/Partial
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.808Z: JOB_MESSAGE_DETAILED: Fusing consumer TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/Combine into TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.841Z: JOB_MESSAGE_DETAILED: Fusing consumer TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/Combine/Extract into TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/Combine
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.883Z: JOB_MESSAGE_DETAILED: Fusing consumer format into TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/Combine/Extract
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.919Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into format
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.951Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3229>) into write/Write/WriteImpl/DoOnce/Impulse
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:32.983Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/DoOnce/Map(decode) into write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3229>)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.015Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/WriteBundles into write/Write/WriteImpl/WindowInto(WindowIntoFn)
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.047Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.080Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/Pair
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.115Z: JOB_MESSAGE_DETAILED: Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/Read
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.156Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.189Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.223Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.256Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.408Z: JOB_MESSAGE_DEBUG: Executing wait step start49
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.483Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:3229>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.505Z: JOB_MESSAGE_BASIC: Executing operation read/Read/Impulse+read/Read/Map(<lambda at iobase.py:898>)+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.532Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.538Z: JOB_MESSAGE_BASIC: Executing operation TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.554Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.566Z: JOB_MESSAGE_BASIC: Executing operation TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.591Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.611Z: JOB_MESSAGE_BASIC: Finished operation TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.611Z: JOB_MESSAGE_BASIC: Finished operation TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.645Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/GroupByKey/Create
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.666Z: JOB_MESSAGE_DEBUG: Value "TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.699Z: JOB_MESSAGE_DEBUG: Value "TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:33.732Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/GroupByKey/Session" materialized.
INFO     apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236 2022-02-14T18:30:59.064Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
[32mINFO [0m
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:31:17.094Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number
of workers to 1 based on the rate of progress in the currently running stage(s).
[32mINFO [0m
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:31:32.391Z: JOB_MESSAGE_DETAILED: Workers have started
successfully.
[32mINFO [0m
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:31:32.417Z: JOB_MESSAGE_DETAILED: Workers have started
successfully.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.548Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda
at
core.py:3229>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.552Z: JOB_MESSAGE_BASIC: Finished operation
read/Read/Impulse+read/Read/Map(<lambda at
iobase.py:898>)+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/PairWithRestriction+ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/SplitWithSizing
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.600Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/DoOnce/Map(decode).None" materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.626Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/InitializeWrite.None" materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.649Z: JOB_MESSAGE_DEBUG: Value
"ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7-split-with-sizing-out3"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.673Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.698Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.717Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.722Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.745Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.754Z: JOB_MESSAGE_BASIC: Executing operation
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing+split+TopPerPrefix/PerElement/PerElement:PairWithVoid+TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey+TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/Combine/Partial+TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.758Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.783Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/FinalizeWrite/View-python_side_input0-write/Write/WriteImpl/FinalizeWrite.out"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.812Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/WriteBundles/View-python_side_input0-write/Write/WriteImpl/WriteBundles.out"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:29.866Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/PreFinalize/View-python_side_input0-write/Write/WriteImpl/PreFinalize.out"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:35.653Z: JOB_MESSAGE_BASIC: Finished operation
ref_AppliedPTransform_read-Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_7/ProcessElementAndRestrictionWithSizing+split+TopPerPrefix/PerElement/PerElement:PairWithVoid+TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey+TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/Combine/Partial+TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:36Z: JOB_MESSAGE_BASIC: Executing operation
TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:36.098Z: JOB_MESSAGE_BASIC: Finished operation
TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:36.183Z: JOB_MESSAGE_BASIC: Executing operation
TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey/Read+TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/Combine+TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/Combine/Extract+TopPerPrefix/FlatMap(extract_prefixes)+TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey+TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/Combine/Partial+TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:37.279Z: JOB_MESSAGE_BASIC: Finished operation
TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/GroupByKey/Read+TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/Combine+TopPerPrefix/PerElement/CombinePerKey(CountCombineFn)/Combine/Extract+TopPerPrefix/FlatMap(extract_prefixes)+TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey+TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/Combine/Partial+TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:37.357Z: JOB_MESSAGE_BASIC: Executing operation
TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:37.397Z: JOB_MESSAGE_BASIC: Finished operation
TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:37.482Z: JOB_MESSAGE_BASIC: Executing operation
TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey/Read+TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/Combine+TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/Combine/Extract+format+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:38.611Z: JOB_MESSAGE_BASIC: Finished operation
TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/GroupByKey/Read+TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/Combine+TopPerPrefix/LargestPerKey(5)/TopPerKey(5)/CombinePerKey(TopCombineFn)/Combine/Extract+format+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/GroupByKey/Write
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:38.679Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:38.722Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/GroupByKey/Close
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:38.787Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/Extract
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:41.018Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/GroupByKey/Read+write/Write/WriteImpl/Extract
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:41.079Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/Extract.None" materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:41.141Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:41.171Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:41.189Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:41.217Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:41.237Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/FinalizeWrite/View-python_side_input1-write/Write/WriteImpl/FinalizeWrite.out"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:41.274Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/PreFinalize/View-python_side_input1-write/Write/WriteImpl/PreFinalize.out"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:41.339Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/PreFinalize
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:44.220Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/PreFinalize
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:44.287Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/PreFinalize.None" materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:44.357Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:44.396Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:44.465Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/FinalizeWrite/View-python_side_input2-write/Write/WriteImpl/FinalizeWrite.out"
materialized.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:44.531Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:46.372Z: JOB_MESSAGE_BASIC: Finished operation
write/Write/WriteImpl/FinalizeWrite
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:46.460Z: JOB_MESSAGE_DEBUG: Executing success step success47
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:46.526Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:46.617Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO
apache_beam.runners.dataflow.dataflow_runner:dataflow_runner.py:236
2022-02-14T18:38:46.652Z: JOB_MESSAGE_BASIC: Stopping worker pool...
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-11' is disconnected.
at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:216)
at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:286)
at com.sun.proxy.$Proxy133.isAlive(Unknown Source)
at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1211)
at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1203)
at hudson.Launcher$ProcStarter.join(Launcher.java:523)
at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:803)
at hudson.model.Build$BuildExecution.build(Build.java:197)
at hudson.model.Build$BuildExecution.doRun(Build.java:163)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:513)
at hudson.model.Run.execute(Run.java:1906)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Caused by: java.io.IOException: Pipe closed after 0 cycles
at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:118)
at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:101)
at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:93)
at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:74)
at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:104)
at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-11 is offline; cannot locate jdk_1.8_latest
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]