See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/1103/display/redirect>

Changes:


------------------------------------------
[...truncated 33.77 KB...]
  Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from 
httplib2<0.23.0,>=0.8->apache-beam==2.49.0.dev0)
  Using cached pyparsing-3.1.0-py3-none-any.whl (102 kB)
Collecting attrs>=19.2.0 (from 
hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from 
hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from 
hypothesis<=7.0.0,>5.0.0->apache-beam==2.49.0.dev0)
  Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from 
pymongo<5.0.0,>=3.8.0->apache-beam==2.49.0.dev0)
  Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0)
  Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in 
<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (1.0.0)
Requirement already satisfied: tomli>=1.0.0 in 
<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (2.0.1)
Requirement already satisfied: importlib-metadata>=0.12 in 
<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
 (from pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) (6.7.0)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.49.0.dev0)
  Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from 
requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached 
charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17 (from 
requests<3.0.0,>=2.24.0->apache-beam==2.49.0.dev0)
  Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached 
scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from 
scikit-learn>=0.20.0->apache-beam==2.49.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from 
sqlalchemy<2.0,>=1.3->apache-beam==2.49.0.dev0)
  Using cached 
greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 
kB)
Collecting docker>=4.0.0 (from 
testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Collecting wrapt (from 
testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached 
wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 (75 kB)
Collecting deprecation (from 
testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from 
testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached PyMySQL-1.1.0rc2-py3-none-any.whl (44 kB)
Collecting pycparser (from 
cffi>=1.12->cryptography>=36.0.0->apache-beam==2.49.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from 
docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.49.0.dev0)
  Using cached websocket_client-1.6.0-py3-none-any.whl (56 kB)
Collecting googleapis-common-protos<2.0.dev0,>=1.56.2 (from 
google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached googleapis_common_protos-1.59.1-py2.py3-none-any.whl (224 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from 
google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.49.0.dev0)
  Using cached 
google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl 
(32 kB)
Requirement already satisfied: zipp>=0.5 in 
<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
 (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.49.0.dev0) 
(3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from 
msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached PyJWT-2.7.0-py3-none-any.whl (22 kB)
Collecting portalocker<3,>=1.0 (from 
msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.49.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from 
oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.49.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: 
filename=apache_beam-2.49.0.dev0-py3-none-any.whl size=3206455 
sha256=45a986952007a122a9c5bd87fe952cee46cc71059319fcd9fd2e1098a3ec4e60
  Stored in directory: 
/home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, 
zstandard, wrapt, websocket-client, urllib3, threadpoolctl, tenacity, sqlparse, 
six, scipy, regex, pyyaml, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, 
pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, 
overrides, orjson, objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, 
googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, 
exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, 
certifi, sqlalchemy, scikit-learn, rsa, requests, python-dateutil, pymongo, 
pydot, pyasn1-modules, isodate, httplib2, grpcio-status, 
google-resumable-media, cffi, attrs, requests_mock, pytest, pandas, 
oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, freezegun, 
docker, cryptography, botocore, azure-core, testcontainers, s3transfer, 
pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, 
google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, 
boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, 
google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, 
google-cloud-language, google-cloud-dlp, google-cloud-datastore, 
google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, 
google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.7.0 apache-beam-2.49.0.dev0 attrs-23.1.0 
azure-core-1.27.1 azure-identity-1.14.0b1 azure-storage-blob-12.17.0b1 
boto3-1.26.155 botocore-1.29.155 certifi-2023.5.7 cffi-1.15.1 
charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-41.0.1 
deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.3 docopt-0.6.2 
exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18 
freezegun-1.2.2 google-api-core-2.11.1 google-apitools-0.5.31 
google-auth-2.20.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.11.1 
google-cloud-bigquery-storage-2.20.0 google-cloud-bigtable-2.19.0 
google-cloud-core-2.3.2 google-cloud-datastore-2.15.2 google-cloud-dlp-3.12.1 
google-cloud-language-2.10.0 google-cloud-pubsub-2.17.1 
google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 
google-cloud-spanner-3.36.0 google-cloud-videointelligence-2.11.2 
google-cloud-vision-3.4.2 google-crc32c-1.5.0 google-resumable-media-2.5.0 
googleapis-common-protos-1.59.1 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 
grpcio-status-1.56.0rc2 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.79.0 idna-3.4 
iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 
msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.1 
overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 
proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 
pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 
pymongo-4.4.0b0 pymysql-1.1.0rc2 pyparsing-3.1.0 pytest-7.3.2 
pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3 
pyyaml-6.0 regex-2023.6.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 
s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 six-1.16.0 
sortedcontainers-2.4.0 sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 
testcontainers-3.7.1 threadpoolctl-3.1.0 urllib3-1.26.16 websocket-client-1.6.0 
wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional 
dependencies to be installed in SDK worker container, consider using the SDK 
container image pre-building workflow to avoid repetitive installations. Learn 
more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for 
Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the 
temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0618215359.1687132250.650749/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0618215359.1687132250.650749/dataflow_python_sdk.tar
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0618215359.1687132250.650749/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0618215359.1687132250.650749/pipeline.pb
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230618235050651661-9485'
 createTime: '2023-06-18T23:50:51.568691Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-18_16_50_51-3623490517627241908'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0618215359'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-18T23:50:51.568691Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: 
[2023-06-18_16_50_51-3623490517627241908]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 
2023-06-18_16_50_51-3623490517627241908
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-18_16_50_51-3623490517627241908?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.test_dataflow_runner:Console log: 
INFO:apache_beam.runners.dataflow.test_dataflow_runner:https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-18_16_50_51-3623490517627241908?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2023-06-18_16_50_51-3623490517627241908 is in state JOB_STATE_RUNNING
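
For context, a hedged sketch of how a submission like the one above is made
from the Python SDK. The option values mirror the log (project, region, temp
bucket, job name, five workers); the exact flag set used by this Jenkins job
is an assumption:

    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.testing.test_pipeline import TestPipeline

    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://temp-storage-for-perf-tests/loadtests',
        '--job_name=performance-tests-psio-python-2gb0618215359',
        '--num_workers=5',
        '--streaming',
    ])

    # TestPipeline wraps Pipeline; on run() it additionally evaluates any
    # pickled on_success_matcher from the test options (the traceback further
    # down shows exactly where that assertion fires).
    pipeline = TestPipeline(options=options)
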
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:56.099Z: 
JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.258Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable 
parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.285Z: 
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.349Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.383Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.414Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write 
steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.463Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.493Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.526Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.561Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at 
iobase.py:908>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.596Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
 into Create input/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.631Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing
 into 
ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.658Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into 
ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.691Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub 
message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.716Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure 
time
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.749Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to 
Pubsub/ToProtobuf
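
The fusing messages above trace the write pipeline's shape: a synthetic
bounded source ('Create input'), a formatting Map, a timing ParDo ('Measure
time'), then the Pub/Sub sink. A hedged reconstruction follows; the step
labels come from the log, while the source parameters, topic name, and the
MeasureTime internals are assumptions:

    import time

    import apache_beam as beam
    from apache_beam.io import WriteToPubSub
    from apache_beam.metrics import Metrics
    from apache_beam.testing.synthetic_pipeline import SyntheticSource

    class MeasureTime(beam.DoFn):
        """Simplified 'Measure time' step: records bundle start/end times
        into a metrics distribution, from which a runtime is later derived."""

        def __init__(self, namespace='pubsub_io_perf_test'):
            self.runtime = Metrics.distribution(namespace, 'runtime')

        def start_bundle(self):
            self.runtime.update(time.time())

        def process(self, element):
            yield element

        def finish_bundle(self):
            self.runtime.update(time.time())

    topic = 'projects/apache-beam-testing/topics/<write-test-topic>'  # assumed

    with beam.Pipeline(options=options) as p:  # options as sketched earlier
        _ = (
            p
            | 'Create input' >> beam.io.Read(SyntheticSource(
                {'numRecords': 2097152, 'keySizeBytes': 1,
                 'valueSizeBytes': 1024}))  # ~2 GB total; sizes assumed
            | 'Format to pubsub message in bytes' >> beam.Map(
                lambda kv: kv[0] + kv[1])
            | 'Measure time' >> beam.ParDo(MeasureTime())
            | 'Write to Pubsub' >> WriteToPubSub(topic=topic))
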
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.858Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.887Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.950Z: 
JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:57.977Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:58.004Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:58.183Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:58.216Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:50:58.242Z: 
JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:51:10.954Z: 
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric 
descriptors, so new user metrics of the form custom.googleapis.com/* will not 
be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:51:39.326Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the 
pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:52:12.043Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:52:22.887Z: 
JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began 
to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-18T23:55:12.572Z: 
JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for 
job 2023-06-18_16_50_51-3623490517627241908 after 724 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results 
for test: ebdcc9d2c46a43a6a49d496722698c8b and timestamp: 1687133130.5178692:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: 
pubsub_io_perf_write_runtime Value: 91
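
The log does not show how "Value: 91" is computed; a hedged sketch using
Beam's metrics API, assuming (as in Beam's load-test utilities) that the
runtime is the span of the distribution recorded by the 'Measure time' step:

    from apache_beam.metrics.metric import MetricsFilter

    result = pipeline.run()  # 'pipeline' as in the submission sketch above
    result.wait_until_finish(duration=724 * 1000)  # milliseconds

    distributions = result.metrics().query(
        MetricsFilter().with_name('runtime'))['distributions']
    runtime_s = distributions[0].committed.max - distributions[0].committed.min
    print('pubsub_io_perf_write_runtime:', int(runtime_s))  # e.g. 91
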
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional 
dependencies to be installed in SDK worker container, consider using the SDK 
container image pre-building workflow to avoid repetitive installations. Learn 
more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for 
Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the 
temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0618215359.1687133135.041877/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0618215359.1687133135.041877/dataflow_python_sdk.tar
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0618215359.1687133135.041877/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0618215359.1687133135.041877/pipeline.pb
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20230619000535042772-5738'
 createTime: '2023-06-19T00:05:36.086838Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-06-18_17_05_35-14362659400383312531'
 location: 'us-central1'
 name: 'performance-tests-psio-python-2gb0618215359'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-06-19T00:05:36.086838Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: 
[2023-06-18_17_05_35-14362659400383312531]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 
2023-06-18_17_05_35-14362659400383312531
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-18_17_05_35-14362659400383312531?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.test_dataflow_runner:Console log: 
INFO:apache_beam.runners.dataflow.test_dataflow_runner:https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-18_17_05_35-14362659400383312531?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2023-06-18_17_05_35-14362659400383312531 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:39.603Z: 
JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.183Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable 
parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.230Z: 
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.299Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.370Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.404Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write 
steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.472Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.536Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.580Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.605Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) 
into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.635Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at 
pubsub_io_perf_test.py:168>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.668Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at 
pubsub_io_perf_test.py:168>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.700Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.733Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.765Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count 
messages/CombinePerKey/Combine/ConvertToAccumulators into Count 
messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.799Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count 
messages/CombinePerKey/GroupByKey/WriteStream into Count 
messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.831Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into 
Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.867Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count 
messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.896Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count 
messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.921Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.941Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert 
to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:42.974Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to 
Pubsub/ToProtobuf
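
Likewise, the read pipeline's shape can be read off the fusing messages:
decode from Pub/Sub, time the elements, window, count everything, then write
the count back to Pub/Sub for verification. A hedged reconstruction; labels
come from the log, while the windowing/trigger choice and the resource names
are assumptions:

    import apache_beam as beam
    from apache_beam.io import ReadFromPubSub, WriteToPubSub
    from apache_beam.transforms import trigger, window

    sub = 'projects/apache-beam-testing/subscriptions/<read-test-sub>'  # assumed
    matcher_topic = 'projects/apache-beam-testing/topics/<matcher>'  # assumed

    with beam.Pipeline(options=options) as p:  # streaming options as earlier
        _ = (
            p
            | 'Read from pubsub' >> ReadFromPubSub(subscription=sub)
            | 'Map' >> beam.Map(lambda msg: msg)  # stands in for the lambda
                                                  # at pubsub_io_perf_test.py:168
            | 'Measure time' >> beam.ParDo(MeasureTime())  # as defined earlier
            | 'Window' >> beam.WindowInto(
                window.GlobalWindows(),
                trigger=trigger.AfterCount(2097152),  # assumed: fire once all
                                                      # expected messages arrive
                accumulation_mode=trigger.AccumulationMode.DISCARDING)
            | 'Count messages' >> beam.CombineGlobally(
                beam.combiners.CountCombineFn()).without_defaults()
            | 'Convert to bytes' >> beam.Map(lambda n: str(n).encode('utf-8'))
            | 'Write to Pubsub' >> WriteToPubSub(topic=matcher_topic))
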
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:43.013Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:43.037Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:43.071Z: 
JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:43.107Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:43.127Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:43.319Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:43.343Z: 
JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:05:43.409Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:06:09.124Z: 
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric 
descriptors, so new user metrics of the form custom.googleapis.com/* will not 
be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:06:26.040Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the 
pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:06:56.618Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-06-19T00:07:03.929Z: 
JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began 
to receive work requests.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for 
job 2023-06-18_17_05_35-14362659400383312531 after 724 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 
messages from 
projects/apache-beam-testing/subscriptions/pubsub_io_performance_505595b0-8d46-4ed9-b8cc-6b4843571fc7_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-18_16_50_51-3623490517627241908?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2023-06-18_17_05_35-14362659400383312531?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py", line 219, in <module>
    PubsubReadPerfTest().run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py", line 150, in run
    self.result = self.pipeline.run()
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py", line 577, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 70, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 58, in assert_that
    _assert_match(actual=actual_or_assertion, matcher=matcher, reason=reason)
  File "https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py", line 73, in _assert_match
    raise AssertionError(description)
AssertionError: 
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
  Expected but not in actual: dict_items([(b'2097152', 1)])
  Unexpected: dict_items([])
  Unexpected (with all details): []
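
The AssertionError above is raised by hamcrest when TestDataflowRunner
evaluates the pickled on_success_matcher (test_dataflow_runner.py:70 in the
traceback). The real matcher, apache_beam.io.gcp.tests.pubsub_matcher, pulls
from the _read_matcher subscription until the expected payload (here
b'2097152', the expected message count) arrives or 900 seconds pass. A
condensed, hedged sketch of that pattern; the pull hook is a hypothetical
stand-in for the real Pub/Sub client calls:

    import time

    from hamcrest import assert_that as hc_assert_that
    from hamcrest.core.base_matcher import BaseMatcher

    class SimplifiedPubSubMatcher(BaseMatcher):
        """Simplified stand-in for the SDK's PubSubMessageMatcher."""

        def __init__(self, expected, pull_fn, timeout=900):
            self.expected = expected  # e.g. [b'2097152']
            self.pull_fn = pull_fn    # hypothetical: returns a list of payloads
            self.timeout = timeout
            self.messages = []

        def _matches(self, _pipeline_result):
            deadline = time.time() + self.timeout
            while (time.time() < deadline
                   and len(self.messages) < len(self.expected)):
                self.messages.extend(self.pull_fn())
                time.sleep(1)
            return sorted(self.messages) == sorted(self.expected)

        def describe_to(self, description):
            description.append_text(
                'Expected %d messages.' % len(self.expected))

        def describe_mismatch(self, _item, description):
            description.append_text(
                'Expected %d messages. Got %d messages.' %
                (len(self.expected), len(self.messages)))

    # With a pull function that never sees a message, this reproduces the
    # failure mode above (short timeout so the demo returns quickly):
    hc_assert_that(None, SimplifiedPubSubMatcher([b'2097152'], lambda: [],
                                                 timeout=5))
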


> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 45m 53s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/nju75m3lkdcgw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
