See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/643/display/redirect>

Changes:


------------------------------------------
[...truncated 56.44 KB...]
  Using cached PyMySQL-1.0.2-py3-none-any.whl (43 kB)
Collecting pycparser
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.55.0-py2.py3-none-any.whl (212 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached 
google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl 
(38 kB)
Requirement already satisfied: zipp>=0.5 in 
<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
 (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.38.0.dev0) 
(3.7.0)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting websocket-client>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: 
filename=apache_beam-2.38.0.dev0-py3-none-any.whl size=2693741 
sha256=2b0e346294f05926ec351a655674d3a05145deaa21519a7d489415dc991d828f
  Stored in directory: 
/home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, 
crcmod, certifi, wrapt, websocket-client, urllib3, typing-utils, 
typing-extensions, tenacity, rsa, pyyaml, python-dateutil, pyparsing, pymysql, 
pymongo, pyhamcrest, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, 
pbr, orjson, oauthlib, numpy, more-itertools, jmespath, isodate, idna, 
greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, 
execnet, dill, cloudpickle, charset-normalizer, cachetools, attrs, 
atomicwrites, sqlalchemy, requests, pydot, pyarrow, pluggy, pandas, overrides, 
mock, httplib2, grpcio-status, grpcio-gcp, google-resumable-media, google-auth, 
freezegun, cffi, botocore, s3transfer, requests-oauthlib, requests_mock, 
pytest, oauth2client, hdfs, grpc-google-iam-v1, google-api-core, docker, 
deprecation, cryptography, azure-core, testcontainers, pytest-timeout, 
pytest-forked, msrest, google-cloud-core, google-apitools, boto3, apache-beam, 
pytest-xdist, google-cloud-vision, google-cloud-videointelligence, 
google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, 
google-cloud-language, google-cloud-dlp, google-cloud-datastore, 
google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, 
azure-storage-blob, google-cloud-pubsublite
  Attempting uninstall: pyparsing
    Found existing installation: pyparsing 3.0.7
    Uninstalling pyparsing-3.0.7:
      Successfully uninstalled pyparsing-3.0.7
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.38.0.dev0 atomicwrites-1.4.0 attrs-21.4.0 
azure-core-1.23.0 azure-storage-blob-12.10.0 boto3-1.21.18 botocore-1.24.18 
cachetools-4.2.4 certifi-2021.10.8 cffi-1.15.0 charset-normalizer-2.0.12 
cloudpickle-2.0.0 crcmod-1.7 cryptography-36.0.1 deprecation-2.1.0 dill-0.3.1.1 
docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.4.10 fasteners-0.17.3 
freezegun-1.2.0 google-api-core-1.31.5 google-apitools-0.5.31 
google-auth-1.35.0 google-cloud-bigquery-2.34.2 
google-cloud-bigquery-storage-2.13.0 google-cloud-bigtable-1.7.0 
google-cloud-core-1.7.2 google-cloud-datastore-1.15.3 google-cloud-dlp-3.6.2 
google-cloud-language-1.3.0 google-cloud-pubsub-2.11.0 
google-cloud-pubsublite-1.4.1 google-cloud-recommendations-ai-0.2.0 
google-cloud-spanner-1.19.1 google-cloud-videointelligence-1.16.1 
google-cloud-vision-1.0.0 google-crc32c-1.3.0 google-resumable-media-2.3.2 
googleapis-common-protos-1.55.0 greenlet-1.1.2 grpc-google-iam-v1-0.12.3 
grpcio-gcp-0.2.2 grpcio-status-1.44.0 hdfs-2.6.0 httplib2-0.19.1 idna-3.3 
isodate-0.6.1 jmespath-0.10.0 mock-2.0.0 more-itertools-8.12.0 msrest-0.6.21 
numpy-1.21.5 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.6.7 overrides-6.1.0 
pandas-1.3.5 parameterized-0.7.5 pbr-5.8.1 pluggy-0.13.1 proto-plus-1.20.3 
psycopg2-binary-2.9.3 pyarrow-6.0.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 
pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-3.12.3 pymysql-1.0.2 
pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.4.0 pytest-timeout-1.4.2 
pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2021.3 pyyaml-6.0 
requests-2.27.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 
s3transfer-0.5.2 sqlalchemy-1.4.32 tenacity-5.1.5 testcontainers-3.4.2 
typing-extensions-4.1.1 typing-utils-0.1.0 urllib3-1.26.8 wcwidth-0.2.5 
websocket-client-1.3.1 wrapt-1.14.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.7 interpreter.
INFO:root:Default Python SDK image for environment is 
apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker 
environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the 
temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/pickled_main_session...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/pickled_main_session
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/dataflow_python_sdk.tar
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/dataflow-****.jar
 in 7 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273048.199908/pipeline.pb
 in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--pubsub_namespace_prefix=pubsub_io_performance_']
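
Note: Beam's PipelineOptions only parses flags that have been registered on
it; anything else is dropped with the warning above. A minimal sketch of
registering such a flag so it survives parsing (the PubsubPerfOptions class
name is illustrative, not taken from the test code):

    from apache_beam.options.pipeline_options import PipelineOptions

    class PubsubPerfOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            # Registered args are parsed; unregistered ones trigger the
            # "Discarding unparseable args" warning seen above.
            parser.add_argument('--pubsub_namespace_prefix', default=None)

    options = PipelineOptions(['--pubsub_namespace_prefix=pubsub_io_performance_'])
    print(options.view_as(PubsubPerfOptions).pubsub_namespace_prefix)
    # -> pubsub_io_performance_
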
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220314155048200808-5100'
 createTime: '2022-03-14T15:50:56.315646Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-14_08_50_55-745859959694178126'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0314150526'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-14T15:50:56.315646Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: 
[2022-03-14_08_50_55-745859959694178126]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 
2022-03-14_08_50_55-745859959694178126
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-14_08_50_55-745859959694178126?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2022-03-14_08_50_55-745859959694178126 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:04.453Z: 
JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.382Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable 
parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.440Z: 
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.510Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.557Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.579Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write 
steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.645Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.687Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.723Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.758Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at 
iobase.py:898>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.781Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
 into Create input/Map(<lambda at iobase.py:898>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.812Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing
 into 
ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.844Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into 
ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.876Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub 
message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.910Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure 
time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:05.946Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to 
Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.040Z: 
JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.086Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.121Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.153Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.188Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.244Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.301Z: 
JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:06.337Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:38.049Z: 
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric 
descriptors, so new user metrics of the form custom.googleapis.com/* will not 
be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
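
Note: the cleanup suggested in the message above can be scripted against the
Monitoring API it links to. A rough sketch, assuming the google-cloud-monitoring
client library (which is not part of the install list in this log):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # Uncomment to actually delete an old / unused descriptor:
        # client.delete_metric_descriptor(request={'name': descriptor.name})
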
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:51.822Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the 
pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:51:51.850Z: 
JOB_MESSAGE_DETAILED: Resized worker pool to 4, though goal was 5.  This could be 
a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:52:02.210Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the 
pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:52:16.632Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T15:52:16.661Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for 
job 2022-03-14_08_50_55-745859959694178126 after 603 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results 
for test: 73ac7c06ea924418946f236e1d1d9e90 and timestamp: 1647273859.559636:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: 
pubsub_io_perf_write_runtime Value: 103
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.7 interpreter.
INFO:root:Default Python SDK image for environment is 
apache/beam_python3.7_sdk:2.38.0.dev
INFO:root:Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208
INFO:root:Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220208" for Docker 
environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the 
temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/pickled_main_session...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/pickled_main_session
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/dataflow_python_sdk.tar
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/dataflow-****.jar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/dataflow-****.jar
 in 5 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0314150526.1647273865.914309/pipeline.pb
 in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--pubsub_namespace_prefix=pubsub_io_performance_']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--pubsub_namespace_prefix=pubsub_io_performance_']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220314160425915190-5284'
 createTime: '2022-03-14T16:04:32.428010Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-03-14_09_04_31-3446386912144349460'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0314150526'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-03-14T16:04:32.428010Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: 
[2022-03-14_09_04_31-3446386912144349460]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 
2022-03-14_09_04_31-3446386912144349460
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-14_09_04_31-3446386912144349460?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2022-03-14_09_04_31-3446386912144349460 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:38.866Z: 
JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.088Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable 
parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.118Z: 
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.172Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.262Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.280Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write 
steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.350Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.432Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.463Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.489Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) 
into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.521Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at 
pubsub_io_perf_test.py:171>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.558Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at 
pubsub_io_perf_test.py:171>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.579Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.600Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.653Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count 
messages/CombinePerKey/Combine/ConvertToAccumulators into Count 
messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.692Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count 
messages/CombinePerKey/GroupByKey/WriteStream into Count 
messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.715Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into 
Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.736Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count 
messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.758Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count 
messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.789Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.821Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert 
to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.854Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to 
Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.882Z: 
JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.916Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.948Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:40.981Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:41.015Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:41.060Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:41.107Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:04:41.138Z: 
JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:05:13.731Z: 
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric 
descriptors, so new user metrics of the form custom.googleapis.com/* will not 
be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:05:26.373Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the 
pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:05:54.661Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-03-14T16:05:54.720Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for 
job 2022-03-14_09_04_31-3446386912144349460 after 601 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 
messages from 
projects/apache-beam-testing/subscriptions/pubsub_io_performance_5f1ba947-d4b1-485d-ba53-8d93cfc679c2_read_matcher.
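
Note: pubsub_matcher's check amounts to pulling from the read-matcher
subscription until the expected message arrives or its 900 s deadline passes.
An illustrative pull-and-count under google-cloud-pubsub 2.x (installed
above); this is a sketch, not the matcher's actual implementation:

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    subscription = ('projects/apache-beam-testing/subscriptions/'
                    'pubsub_io_performance_5f1ba947-d4b1-485d-ba53-8d93cfc679c2_read_matcher')
    response = subscriber.pull(
        request={'subscription': subscription, 'max_messages': 1},
        timeout=30)  # the real matcher keeps retrying until its deadline
    print('Received %d messages' % len(response.received_messages))
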
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-14_08_50_55-745859959694178126?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2022-03-14_09_04_31-3446386912144349460?project=apache-beam-testing
Traceback (most recent call last):
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 149, in run
    self.result = self.pipeline.run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py", line 573, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 68, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 60, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 223, in <module>
    PubsubReadPerfTest().run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 154, in run
    self.cleanup()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 211, in cleanup
    self.sub_client.delete_subscription(self.read_sub_name)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/pubsub_v1/services/subscriber/client.py", line 960, in delete_subscription
    request = pubsub.DeleteSubscriptionRequest(request)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/proto/message.py", line 522, in __init__
    % (self.__class__.__name__, mapping,)
TypeError: Invalid constructor input for DeleteSubscriptionRequest: 
'projects/apache-beam-testing/subscriptions/pubsub_io_performance_5f1ba947-d4b1-485d-ba53-8d93cfc679c2_read'
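
Note: this TypeError is a cleanup bug, distinct from the read timeout above:
the google-cloud-pubsub 2.x GAPIC client (2.11.0 in this run) no longer
accepts a bare subscription path as the positional `request` argument. A
sketch of the likely fix for cleanup() in pubsub_io_perf_test.py, under that
assumption (sub_name stands in for self.read_sub_name):

    from google.cloud import pubsub_v1

    sub_client = pubsub_v1.SubscriberClient()
    sub_name = 'projects/<project>/subscriptions/<subscription>'

    # 1.x style; on 2.x it raises the TypeError shown above:
    #   sub_client.delete_subscription(sub_name)

    # 2.x style: pass the path as a keyword, or wrap it in a request dict.
    sub_client.delete_subscription(subscription=sub_name)
    # or: sub_client.delete_subscription(request={'subscription': sub_name})
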

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.3.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 43m 7s
92 actionable tasks: 57 executed, 33 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dpmklkn6ivltk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
