See <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/1044/display/redirect>
Changes:
------------------------------------------
[...truncated 39.01 KB...]
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
  Using cached certifi-2022.12.7-py3-none-any.whl (155 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
  Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
  Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached docker-6.0.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
  Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
  Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
  Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3056916 sha256=53f5d9a85109321cf796efe2670d09e5be648fb96439f5115a85d8e1f714665d
  Stored in directory: /home/jenkins/.cache/pip/wheels/ab/30/db/7a1d878abb34081c011192312e662d364ee443467dfb553778
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod, zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity, sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate, iniconfig, idna, greenlet, googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot, pyasn1-modules, pandas, httplib2, grpcio-status, google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer, requests_mock, pytest, oauth2client, hypothesis, hdfs, grpc-google-iam-v1, google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist, pytest-timeout, google-auth-httplib2, google-apitools, google-api-core, boto3, azure-storage-blob, apache-beam, msal, google-cloud-core, msal-extensions, google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, azure-identity
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0 azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0 boto3-1.26.117 botocore-1.29.117 cachetools-4.2.4 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.0.1 docopt-0.6.2 exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.3 fasteners-0.18 freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31 google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0 google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0 google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1 google-cloud-language-2.9.1 google-cloud-pubsub-2.16.0 google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3 google-cloud-spanner-3.31.0 google-cloud-videointelligence-2.11.1 google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.4.1 googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6 grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.21.0 hypothesis-6.72.1 idna-3.4 iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2 msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.8.10 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0 portalocker-2.7.0 proto-plus-1.22.2 psycopg2-binary-2.9.6 pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1 pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3 pyyaml-6.0 regex-2022.10.31 requests-2.28.2 requests_mock-1.10.0 rsa-4.9 s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0 sqlalchemy-1.4.47 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1 threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0420200428.1682034655.134290/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0420200428.1682034655.134290/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0420200428.1682034655.134290/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0420200428.1682034655.134290/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job clientRequestId: '20230420235055135250-7906' createTime: '2023-04-20T23:50:56.213593Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2023-04-20_16_50_55-2979027938665893969' location: 'us-central1' name: 'performance-tests-psio-python-2gb0420200428' projectId: 'apache-beam-testing' stageStates: [] startTime: '2023-04-20T23:50:56.213593Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-20_16_50_55-2979027938665893969]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-20_16_50_55-2979027938665893969
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-20_16_50_55-2979027938665893969?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.test_dataflow_runner:Console log:
INFO:apache_beam.runners.dataflow.test_dataflow_runner:https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-20_16_50_55-2979027938665893969?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-20_16_50_55-2979027938665893969 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:00.632Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:01.919Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:01.950Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.017Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.062Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.093Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.129Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.162Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.210Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.253Z: JOB_MESSAGE_DETAILED: Fusing consumer Create input/Map(<lambda at iobase.py:908>) into Create input/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.281Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Create input/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.313Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.348Z: JOB_MESSAGE_DETAILED: Fusing consumer Format to pubsub message in bytes into ref_AppliedPTransform_Create-input-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.381Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format to pubsub message in bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.414Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.449Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.544Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.580Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.605Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.637Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.660Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.833Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.861Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:02.894Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:38.566Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:49.503Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:49.532Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 4, though goal was 5. This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:51:59.286Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:52:20.103Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-20T23:52:30.640Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:00:10.291Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2023-04-20_16_50_55-2979027938665893969 after 721 seconds
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results for test: d8df0e9fd4ce4080a1e630ee2e47e320 and timestamp: 1682035533.1921911:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: pubsub_io_perf_write_runtime Value: 13
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20230412" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0420200428.1682035538.169152/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0420200428.1682035538.169152/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0420200428.1682035538.169152/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-psio-python-2gb0420200428.1682035538.169152/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job clientRequestId: '20230421000538170090-1005' createTime: '2023-04-21T00:05:39.407434Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2023-04-20_17_05_38-17341258334423762038' location: 'us-central1' name: 'performance-tests-psio-python-2gb0420200428' projectId: 'apache-beam-testing' stageStates: [] startTime: '2023-04-21T00:05:39.407434Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-04-20_17_05_38-17341258334423762038]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-04-20_17_05_38-17341258334423762038
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-20_17_05_38-17341258334423762038?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.test_dataflow_runner:Console log:
INFO:apache_beam.runners.dataflow.test_dataflow_runner:https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-20_17_05_38-17341258334423762038?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-04-20_17_05_38-17341258334423762038 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:44.696Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.111Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.138Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.203Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.284Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.313Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.376Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.440Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.489Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.531Z: JOB_MESSAGE_DETAILED: Fusing consumer Read from pubsub/Map(_from_proto_str) into Read from pubsub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.560Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at pubsub_io_perf_test.py:168>) into Read from pubsub/Map(_from_proto_str)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.595Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Map(<lambda at pubsub_io_perf_test.py:168>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.627Z: JOB_MESSAGE_DETAILED: Fusing consumer Window into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.658Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/KeyWithVoid into Window
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.681Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/ConvertToAccumulators into Count messages/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.704Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/GroupByKey/WriteStream into Count messages/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.727Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine into Count messages/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.762Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/CombinePerKey/Combine/Extract into Count messages/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.799Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages/UnKey into Count messages/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.830Z: JOB_MESSAGE_DETAILED: Fusing consumer Convert to bytes into Count messages/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.863Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/ToProtobuf into Convert to bytes
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.896Z: JOB_MESSAGE_DETAILED: Fusing consumer Write to Pubsub/Write into Write to Pubsub/ToProtobuf
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.943Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:47.987Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:48.021Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:48.052Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:48.084Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:48.277Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:48.320Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:05:48.344Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:06:02.846Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:06:27.527Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:06:27.554Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool to 3, though goal was 5. This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:06:37.333Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:06:59.123Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-04-21T00:07:09.811Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2023-04-20_17_05_38-17341258334423762038 after 723 seconds
ERROR:apache_beam.io.gcp.tests.pubsub_matcher:Timeout after 900 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/pubsub_io_performance_1b62483e-fe44-49c5-bf4e-e9e77aa16790_read_matcher.
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-20_16_50_55-2979027938665893969?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-20_17_05_38-17341258334423762038?project=apache-beam-testing
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/io/gcp/pubsub_io_perf_test.py>", line 219, in <module>
    PubsubReadPerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 150, in run
    self.result = self.pipeline.run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/test_pipeline.py>", line 114, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/pipeline.py>", line 577, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 70, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 44, in assert_that
    _assert_match(actual=arg1, matcher=arg2, reason=arg3)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/hamcrest/core/assert_that.py>", line 60, in _assert_match
    raise AssertionError(description)
AssertionError:
Expected: (Expected 1 messages.)
     but: Expected 1 messages. Got 0 messages. Diffs (item, count):
Expected but not in actual: dict_items([(b'2097152', 1)])
Unexpected: dict_items([])
Unexpected (with all details): []

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_PubsubIOIT_Python_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 46m 16s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
	at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
	at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
	at java.lang.reflect.WeakCache.get(WeakCache.java:127)
	at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
	at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:64)
	at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:59)
	at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Publishing build scan...
https://gradle.com/s/louwmryansi7y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
