See 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Streaming/405/display/redirect?page=changes>

Changes:

[Robert Bradshaw] Restore "Default to Runner v2 for Python Streaming jobs. 
(#15140)"

[stefan.istrate] Fix "too many pings" errors.

[stefan.istrate] Increase keepalive timeout to 5 minutes.

[david.prieto] [BEAM-12950] Not delete orphaned files to avoid missing events

[david.prieto] [BEAM-12950] Add Bug fix description to CHANGES.md

[david.prieto] [BEAM-12950] fix linter issues

[Robert Bradshaw] Dead letter option.

[Robert Bradshaw] Guard setup.py logic with __main__ condition.

[stefan.istrate] Fix yapf complaints.

[Robert Bradshaw] Avoid incompatible setting.

[aydar.zaynutdinov] [BEAM-12969] [Playground]

[david.prieto] [BEAM-12950] Skip unit test

[noreply] [BEAM-12909][BEAM-12849]  Add support for running spark3 nexmark 
queries

[Robert Bradshaw] Add the ability to use subprocesses with the dead letter 
queue.

[Robert Bradshaw] Support multi-output DoFns.

[noreply] Merge pull request #15510 from [BEAM-12883] Add coder for

[Robert Bradshaw] multi-output fix

[Robert Bradshaw] Add thresholding to dead letter pattern.

[Robert Bradshaw] threshold test fixes

[Robert Bradshaw] Better naming, documentation.

[noreply] [BEAM-12482] Ensure that we ignore schema update options when loading

[Kyle Weaver] Moving to 2.35.0-SNAPSHOT on master branch.

[Kyle Weaver] Add 2.35.0 section to changelog.

[noreply] Fix email links in the contact page


------------------------------------------
[...truncated 31.17 KB...]
  Using cached cffi-1.14.6-cp37-cp37m-manylinux1_x86_64.whl (402 kB)
Collecting pycparser
  Using cached pycparser-2.20-py2.py3-none-any.whl (112 kB)
Collecting fasteners>=0.14
  Using cached fasteners-0.16.3-py2.py3-none-any.whl (28 kB)
Requirement already satisfied: setuptools>=40.3.0 in 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
 (from google-auth<3,>=1.18.0->apache-beam==2.35.0.dev0) (58.2.0)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.7.2-py3-none-any.whl (34 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting google-api-core[grpc]<3.0.0dev,>=1.29.0
  Using cached google_api_core-2.1.0-py2.py3-none-any.whl (94 kB)
Requirement already satisfied: packaging>=14.3 in 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
 (from google-cloud-bigquery<3,>=1.6.0->apache-beam==2.35.0.dev0) (21.0)
Collecting proto-plus>=1.10.0
  Using cached proto_plus-1.19.2-py3-none-any.whl (43 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0
  Using cached google_resumable_media-2.0.3-py2.py3-none-any.whl (75 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.53.0-py2.py3-none-any.whl (198 kB)
Collecting libcst>=0.2.5
  Using cached libcst-0.3.21-py3-none-any.whl (514 kB)
Collecting grpc-google-iam-v1<0.13dev,>=0.12.3
  Using cached grpc_google_iam_v1-0.12.3-py3-none-any.whl
Collecting google-cloud-bigtable<2,>=0.31.1
  Using cached google_cloud_bigtable-1.6.1-py2.py3-none-any.whl (267 kB)
  Using cached google_cloud_bigtable-1.6.0-py2.py3-none-any.whl (267 kB)
  Using cached google_cloud_bigtable-1.5.1-py2.py3-none-any.whl (266 kB)
  Using cached google_cloud_bigtable-1.5.0-py2.py3-none-any.whl (266 kB)
  Using cached google_cloud_bigtable-1.4.0-py2.py3-none-any.whl (265 kB)
  Using cached google_cloud_bigtable-1.3.0-py2.py3-none-any.whl (259 kB)
  Using cached google_cloud_bigtable-1.2.1-py2.py3-none-any.whl (234 kB)
  Using cached google_cloud_bigtable-1.2.0-py2.py3-none-any.whl (234 kB)
  Using cached google_cloud_bigtable-1.1.0-py2.py3-none-any.whl (234 kB)
  Using cached google_cloud_bigtable-1.0.0-py2.py3-none-any.whl (232 kB)
  Using cached google_cloud_bigtable-0.34.0-py2.py3-none-any.whl (232 kB)
  Using cached google_cloud_bigtable-0.33.0-py2.py3-none-any.whl (230 kB)
Collecting grpc-google-iam-v1<0.12dev,>=0.11.4
  Using cached grpc-google-iam-v1-0.11.4.tar.gz (12 kB)
Collecting google-cloud-bigtable<2,>=0.31.1
  Using cached google_cloud_bigtable-0.32.2-py2.py3-none-any.whl (156 kB)
  Using cached google_cloud_bigtable-0.32.1-py2.py3-none-any.whl (156 kB)
  Using cached google_cloud_bigtable-0.32.0-py2.py3-none-any.whl (155 kB)
  Using cached google_cloud_bigtable-0.31.1-py2.py3-none-any.whl (154 kB)
INFO: pip is looking at multiple versions of google-cloud-bigquery-storage to 
determine which version is compatible with other requirements. This could take 
a while.
Collecting google-cloud-bigquery-storage>=2.6.3
  Using cached google_cloud_bigquery_storage-2.9.0-py2.py3-none-any.whl (170 kB)
  Using cached google_cloud_bigquery_storage-2.8.0-py2.py3-none-any.whl (131 kB)
  Using cached google_cloud_bigquery_storage-2.7.0-py2.py3-none-any.whl (125 kB)
  Using cached google_cloud_bigquery_storage-2.6.3-py2.py3-none-any.whl (125 kB)
INFO: pip is looking at multiple versions of google-api-core[grpc] to determine 
which version is compatible with other requirements. This could take a while.
Collecting google-api-core[grpc]<3.0.0dev,>=1.29.0
  Using cached google_api_core-2.0.1-py2.py3-none-any.whl (92 kB)
INFO: pip is looking at multiple versions of google-cloud-bigquery-storage to 
determine which version is compatible with other requirements. This could take 
a while.
  Using cached google_api_core-2.0.0-py2.py3-none-any.whl (92 kB)
INFO: This is taking longer than usual. You might need to provide the 
dependency resolver with stricter constraints to reduce runtime. If you want to 
abort this run, you can press Ctrl + C to do so. To improve how pip performs, 
tell us what happened here: https://pip.pypa.io/surveys/backtracking
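The pip hint above suggests giving the resolver stricter constraints to cut down on backtracking. A hedged sketch of what that could look like for this build; the pins are illustrative, copied from the versions pip eventually settled on later in this log, not a recommendation:

```shell
# Illustrative constraints file; versions taken from the final
# "Successfully installed" line of this build.
cat > /tmp/beam-constraints.txt <<'EOF'
google-api-core==1.31.3
google-cloud-bigtable==1.7.0
google-cloud-bigquery-storage==2.9.1
EOF
# The install would then be run as (not executed here):
#   pip install -c /tmp/beam-constraints.txt apache-beam
grep -c '==' /tmp/beam-constraints.txt
```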
  Using cached google_api_core-1.31.3-py2.py3-none-any.whl (93 kB)
Collecting protobuf<4,>=3.12.2
  Using cached 
protobuf-3.17.3-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (1.0 MB)
Collecting google-auth<3,>=1.18.0
  Using cached google_auth-1.35.0-py2.py3-none-any.whl (152 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached 
google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl 
(38 kB)
Collecting docopt
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Requirement already satisfied: pyparsing<3,>=2.4.2 in 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
 (from httplib2<0.20.0,>=0.8->apache-beam==2.35.0.dev0) (2.4.7)
Collecting typing-inspect>=0.4.0
  Using cached typing_inspect-0.7.1-py3-none-any.whl (8.4 kB)
Collecting pbr>=0.11
  Using cached pbr-5.6.0-py2.py3-none-any.whl (111 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.0-py2.py3-none-any.whl (23 kB)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.0-py2.py3-none-any.whl (45 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2021.5.30-py2.py3-none-any.whl (145 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting attrs>=17.4.0
  Using cached attrs-21.2.0-py2.py3-none-any.whl (53 kB)
Collecting pluggy<1.0,>=0.12
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting wcwidth
  Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.0-py2.py3-none-any.whl (6.8 kB)
Requirement already satisfied: py>=1.5.0 in 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
 (from pytest<5.0,>=4.4.0->apache-beam==2.35.0.dev0) (1.10.0)
Requirement already satisfied: importlib-metadata>=0.12 in 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
 (from pytest<5.0,>=4.4.0->apache-beam==2.35.0.dev0) (2.1.1)
Collecting more-itertools>=4.0.0
  Using cached more_itertools-8.10.0-py3-none-any.whl (51 kB)
Requirement already satisfied: zipp>=0.5 in 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
 (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.35.0.dev0) 
(3.6.0)
Collecting execnet>=1.1
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.3.0-py2.py3-none-any.whl (4.7 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.6-py3-none-any.whl (37 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.2-py3-none-any.whl (59 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.1.1-py2.py3-none-any.whl (146 kB)
Collecting greenlet!=0.4.17
  Using cached 
greenlet-1.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (150 
kB)
Collecting deprecation
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting wrapt
  Using cached wrapt-1.13.1-cp37-cp37m-manylinux2010_x86_64.whl (79 kB)
Collecting docker
  Using cached docker-5.0.2-py2.py3-none-any.whl (145 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.2.1-py2.py3-none-any.whl (52 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: 
filename=apache_beam-2.35.0.dev0-py3-none-any.whl size=2571592 
sha256=a4cd3856c77e1bf72e44be00ced9b1e7b6bda62885177bd37272b54c61b40452
  Stored in directory: 
/home/jenkins/.cache/pip/wheels/0b/f4/6b/a024b397e4938b6f618aad92c6194894a81dedd34e7aaa54d0
Successfully built apache-beam
Installing collected packages: pyasn1, urllib3, rsa, pyasn1-modules, protobuf, 
idna, charset-normalizer, certifi, cachetools, wcwidth, typing-extensions, 
requests, pytz, python-dateutil, pycparser, pluggy, oauthlib, mypy-extensions, 
more-itertools, jmespath, googleapis-common-protos, google-auth, attrs, 
atomicwrites, websocket-client, typing-inspect, requests-oauthlib, pyyaml, 
pytest, numpy, isodate, httplib2, grpcio-gcp, google-crc32c, google-api-core, 
docopt, cffi, botocore, wrapt, s3transfer, pytest-forked, pymongo, pydot, 
pyarrow, proto-plus, pbr, orjson, oauth2client, msrest, libcst, hdfs, 
grpc-google-iam-v1, greenlet, google-resumable-media, google-cloud-core, 
fasteners, fastavro, execnet, docker, dill, deprecation, cryptography, crcmod, 
azure-core, avro-python3, testcontainers, tenacity, sqlalchemy, requests-mock, 
pytest-xdist, pytest-timeout, pyhamcrest, psycopg2-binary, parameterized, 
pandas, mock, google-cloud-vision, google-cloud-videointelligence, 
google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, 
google-cloud-language, google-cloud-dlp, google-cloud-datastore, 
google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, 
google-apitools, freezegun, boto3, azure-storage-blob, apache-beam
  Attempting uninstall: protobuf
    Found existing installation: protobuf 3.18.1
    Uninstalling protobuf-3.18.1:
      Successfully uninstalled protobuf-3.18.1
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.35.0.dev0 atomicwrites-1.4.0 attrs-21.2.0 
avro-python3-1.9.2.1 azure-core-1.19.0 azure-storage-blob-12.9.0 boto3-1.18.56 
botocore-1.21.56 cachetools-4.2.4 certifi-2021.5.30 cffi-1.14.6 
charset-normalizer-2.0.6 crcmod-1.7 cryptography-35.0.0 deprecation-2.1.0 
dill-0.3.1.1 docker-5.0.2 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.5 
fasteners-0.16.3 freezegun-1.1.0 google-api-core-1.31.3 google-apitools-0.5.31 
google-auth-1.35.0 google-cloud-bigquery-2.28.0 
google-cloud-bigquery-storage-2.9.1 google-cloud-bigtable-1.7.0 
google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-1.0.0 
google-cloud-language-1.3.0 google-cloud-pubsub-1.7.0 
google-cloud-recommendations-ai-0.2.0 google-cloud-spanner-1.19.1 
google-cloud-videointelligence-1.16.1 google-cloud-vision-1.0.0 
google-crc32c-1.3.0 google-resumable-media-2.0.3 
googleapis-common-protos-1.53.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 
grpcio-gcp-0.2.2 hdfs-2.6.0 httplib2-0.19.1 idna-3.2 isodate-0.6.0 
jmespath-0.10.0 libcst-0.3.21 mock-2.0.0 more-itertools-8.10.0 msrest-0.6.21 
mypy-extensions-0.4.3 numpy-1.20.3 oauth2client-4.1.3 oauthlib-3.1.1 
orjson-3.6.4 pandas-1.3.3 parameterized-0.7.5 pbr-5.6.0 pluggy-0.13.1 
proto-plus-1.19.2 protobuf-3.17.3 psycopg2-binary-2.9.1 pyarrow-5.0.0 
pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.20 pydot-1.4.2 pyhamcrest-1.10.1 
pymongo-3.12.0 pytest-4.6.11 pytest-forked-1.3.0 pytest-timeout-1.4.2 
pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-5.4.1 
requests-2.26.0 requests-mock-1.9.3 requests-oauthlib-1.3.0 rsa-4.7.2 
s3transfer-0.5.0 sqlalchemy-1.4.25 tenacity-5.1.5 testcontainers-3.4.2 
typing-extensions-3.10.0.2 typing-inspect-0.7.1 urllib3-1.26.7 wcwidth-0.2.5 
websocket-client-1.2.1 wrapt-1.13.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK 
"https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz"
 to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.7 interpreter.
INFO:root:Default Python SDK image for environment is 
apache/beam_python3.7_sdk:2.35.0.dev
INFO:root:Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210920
INFO:root:Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20210920" for Docker 
environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the 
temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-gbk-3-1007100236.1633608927.524332/dataflow_python_sdk.tar...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-gbk-3-1007100236.1633608927.524332/dataflow_python_sdk.tar
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-gbk-3-1007100236.1633608927.524332/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-gbk-3-1007100236.1633608927.524332/pipeline.pb
 in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--iterations=1', '--fanout=1', 'shuffle_mode=appliance']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--iterations=1', '--fanout=1', 'shuffle_mode=appliance']
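The "Discarding unparseable args" warning above is Beam's PipelineOptions reporting leftovers from a parse_known_args-style pass; notably, `shuffle_mode=appliance` appears to be missing its `--` prefix, so it can never match an option. A minimal argparse sketch of the behavior (option names are illustrative, not Beam's actual flag set):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--shuffle_mode")

# Properly prefixed, the flag is consumed and nothing is left over.
_, unparseable = parser.parse_known_args(["--shuffle_mode=appliance"])
assert unparseable == []

# The bare token from the log falls through as an unparseable arg.
_, unparseable = parser.parse_known_args(["shuffle_mode=appliance"])
assert unparseable == ["shuffle_mode=appliance"]
```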
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20211007121527525345-1193'
 createTime: '2021-10-07T12:15:28.563812Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-10-07_05_15_27-1709688089246125864'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-gbk-3-1007100236'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-10-07T12:15:28.563812Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: 
[2021-10-07_05_15_27-1709688089246125864]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 
2021-10-07_05_15_27-1709688089246125864
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-10-07_05_15_27-1709688089246125864?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2021-10-07_05_15_27-1709688089246125864 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:34.471Z: 
JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:35.372Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable 
parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:35.441Z: 
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:35.524Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:35.555Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey 0: GroupByKey 
not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:35.595Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:35.632Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write 
steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:35.688Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:35.745Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:35.822Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:35.879Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Read/Map(<lambda at iobase.py:898>) into 
Read/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:35.932Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
 into Read/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:35.966Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing
 into 
ref_AppliedPTransform_Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:35.999Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into 
ref_AppliedPTransform_Read-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:36.071Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Assign timestamps into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:36.132Z: 
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey 0/WriteStream into Assign 
timestamps
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:36.195Z: 
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey 0/MergeBuckets into GroupByKey 
0/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:36.239Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Ungroup 0 into GroupByKey 0/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:36.297Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Measure latency 0 into Ungroup 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:36.347Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Measure latency 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:36.514Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:36.563Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:36.597Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:36.633Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:36.682Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:36.714Z: 
JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:15:36.750Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2021-10-07_05_15_27-1709688089246125864 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:16:02.890Z: 
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric 
descriptors, so new user metrics of the form custom.googleapis.com/* will not 
be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:16:21.210Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the 
pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:16:50.399Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T12:16:50.463Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2021-10-07_05_15_27-1709688089246125864 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T16:00:27.604Z: 
JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 
2021-10-07_05_15_27-1709688089246125864.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T16:00:27.708Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T16:00:27.753Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T16:00:27.781Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T16:00:27.825Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2021-10-07T16:00:27.846Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File 
"https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/group_by_key_test.py",
 line 118, in <module>
    GroupByKeyTest().run()
  File 
"https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py",
 line 151, in run
    self.result.wait_until_finish(duration=self.timeout_ms)
  File 
"https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",
 line 1634, in wait_until_finish
    'Job did not reach to a terminal state after waiting indefinitely.')
AssertionError: Job did not reach to a terminal state after waiting 
indefinitely.
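The AssertionError above is raised after `wait_until_finish(duration=self.timeout_ms)` returns without the job reaching a terminal state. A minimal sketch of that guard, using a hypothetical `FakeResult` stand-in for a Dataflow PipelineResult (state names mirror the JOB_STATE_* values in this log; cancelling instead of asserting is shown as one possible handling, not Beam's actual behavior):

```python
# States in which a Dataflow job needs no further action.
TERMINAL_STATES = {"JOB_STATE_DONE", "JOB_STATE_FAILED",
                   "JOB_STATE_CANCELLED", "JOB_STATE_DRAINED",
                   "JOB_STATE_UPDATED"}

class FakeResult:
    """Hypothetical stand-in for the result of pipeline.run()."""
    def __init__(self, state):
        self._state = state
        self.cancelled = False

    def wait_until_finish(self, duration=None):
        # With a `duration` (milliseconds), the real call returns the
        # job's current state, which may still be non-terminal.
        return self._state

    def cancel(self):
        self.cancelled = True

def finish_or_cancel(result, timeout_ms):
    """Wait up to timeout_ms; cancel the job if it never terminates."""
    state = result.wait_until_finish(duration=timeout_ms)
    if state not in TERMINAL_STATES:
        result.cancel()
    return state

stuck = FakeResult("JOB_STATE_RUNNING")
assert finish_or_cancel(stuck, timeout_ms=1000) == "JOB_STATE_RUNNING"
assert stuck.cancelled
```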

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 
'https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'
 line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.9.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3h 47m 35s
5 actionable tasks: 5 executed

Publishing build scan...
https://gradle.com/s/nmikwgcvaxxpo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
