See
<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/989/display/redirect>
Changes:
------------------------------------------
[...truncated 34.95 KB...]
Downloading botocore-1.29.130-py3-none-any.whl (10.7 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 10.7/10.7 MB 80.7 MB/s eta 0:00:00
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
Using cached jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.9->apache-beam==2.48.0.dev0)
Using cached s3transfer-0.6.1-py3-none-any.whl (79 kB)
Collecting cffi>=1.12 (from cryptography>=36.0.0->apache-beam==2.48.0.dev0)
Using cached cffi-1.15.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB)
Collecting oauth2client>=1.4.12 (from google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.18.0->apache-beam==2.48.0.dev0)
Using cached rsa-4.9-py3-none-any.whl (34 kB)
Collecting google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
Using cached google_api_core-2.11.0-py3-none-any.whl (120 kB)
Collecting google-resumable-media<3.0dev,>=0.6.0 (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
Using cached google_resumable_media-2.5.0-py2.py3-none-any.whl (77 kB)
Requirement already satisfied: packaging>=20.0.0 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0) (23.1)
Collecting grpc-google-iam-v1<1.0.0dev,>=0.12.4 (from google-cloud-bigtable<3,>=2.0.0->apache-beam==2.48.0.dev0)
Using cached grpc_google_iam_v1-0.12.6-py2.py3-none-any.whl (26 kB)
Collecting grpcio-status>=1.33.2 (from google-cloud-pubsub<3,>=2.1.0->apache-beam==2.48.0.dev0)
Using cached grpcio_status-1.54.0-py3-none-any.whl (5.1 kB)
Collecting overrides<7.0.0,>=6.0.1 (from google-cloud-pubsublite<2,>=1.2.0->apache-beam==2.48.0.dev0)
Using cached overrides-6.5.0-py3-none-any.whl (17 kB)
Collecting sqlparse>=0.3.0 (from google-cloud-spanner<4,>=3.0.0->apache-beam==2.48.0.dev0)
Using cached sqlparse-0.4.4-py3-none-any.whl (41 kB)
Collecting docopt (from hdfs<3.0.0,>=2.1.0->apache-beam==2.48.0.dev0)
Using cached docopt-0.6.2-py2.py3-none-any.whl
Collecting pyparsing!=3.0.0,!=3.0.1,!=3.0.2,!=3.0.3,<4,>=2.4.2 (from httplib2<0.23.0,>=0.8->apache-beam==2.48.0.dev0)
Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting attrs>=19.2.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
Using cached attrs-23.1.0-py3-none-any.whl (61 kB)
Collecting sortedcontainers<3.0.0,>=2.1.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Collecting exceptiongroup>=1.0.0 (from hypothesis<=7.0.0,>5.0.0->apache-beam==2.48.0.dev0)
Using cached exceptiongroup-1.1.1-py3-none-any.whl (14 kB)
Collecting dnspython<3.0.0,>=1.16.0 (from pymongo<5.0.0,>=3.8.0->apache-beam==2.48.0.dev0)
Using cached dnspython-2.3.0-py3-none-any.whl (283 kB)
Collecting iniconfig (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
Using cached iniconfig-2.0.0-py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<2.0,>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (1.0.0)
Collecting tomli>=1.0.0 (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0)
Using cached tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: importlib-metadata>=0.12 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (2.1.3)
Collecting execnet>=1.1 (from pytest-xdist<4,>=2.5.0->apache-beam==2.48.0.dev0)
Using cached execnet-1.9.0-py2.py3-none-any.whl (39 kB)
Collecting charset-normalizer<4,>=2 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
Using cached charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (171 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
Using cached urllib3-2.0.2-py3-none-any.whl (123 kB)
Collecting certifi>=2017.4.17 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
Using cached certifi-2023.5.7-py3-none-any.whl (156 kB)
Collecting scipy>=1.1.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
Using cached scipy-1.7.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.1 MB)
Collecting threadpoolctl>=2.0.0 (from scikit-learn>=0.20.0->apache-beam==2.48.0.dev0)
Using cached threadpoolctl-3.1.0-py3-none-any.whl (14 kB)
Collecting greenlet!=0.4.17 (from sqlalchemy<2.0,>=1.3->apache-beam==2.48.0.dev0)
Using cached greenlet-2.0.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (566 kB)
Collecting docker>=4.0.0 (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
Using cached docker-6.1.1-py3-none-any.whl (147 kB)
Collecting wrapt (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
Using cached wrapt-1.15.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (75 kB)
Collecting deprecation (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
Using cached PyMySQL-1.0.3-py3-none-any.whl (43 kB)
Collecting urllib3<3,>=1.21.1 (from requests<3.0.0,>=2.24.0->apache-beam==2.48.0.dev0)
Using cached urllib3-1.26.15-py2.py3-none-any.whl (140 kB)
Collecting pycparser (from cffi>=1.12->cryptography>=36.0.0->apache-beam==2.48.0.dev0)
Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.48.0.dev0)
Using cached websocket_client-1.5.1-py3-none-any.whl (55 kB)
Collecting googleapis-common-protos<2.0dev,>=1.56.2 (from google-api-core[grpc]!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.0,<3.0.0dev,>=1.31.5->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
Using cached googleapis_common_protos-1.59.0-py2.py3-none-any.whl (223 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.48.0.dev0)
Using cached google_crc32c-1.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (32 kB)
Requirement already satisfied: zipp>=0.5 in <https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages> (from importlib-metadata>=0.12->pytest<8.0,>=7.1.2->apache-beam==2.48.0.dev0) (3.15.0)
Collecting PyJWT[crypto]<3,>=1.0.0 (from msal<2.0.0,>=1.12.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
Using cached PyJWT-2.6.0-py3-none-any.whl (20 kB)
Collecting portalocker<3,>=1.0 (from msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.48.0.dev0)
Using cached portalocker-2.7.0-py2.py3-none-any.whl (15 kB)
Collecting pyasn1>=0.1.7 (from oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.48.0.dev0)
Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Building wheels for collected packages: apache-beam
Building wheel for apache-beam (setup.py): started
Building wheel for apache-beam (setup.py): finished with status 'done'
Created wheel for apache-beam: filename=apache_beam-2.48.0.dev0-py3-none-any.whl size=3068516 sha256=36b9cf49eb8b09ae7e6b3b20554a4fb1d7ec9853b2b6b70744fccb842929b13e
Stored in directory: /home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, docopt, crcmod,
zstandard, wrapt, websocket-client, urllib3, tomli, threadpoolctl, tenacity,
sqlparse, scipy, regex, pyyaml, python-dateutil, pyparsing, pymysql, PyJWT,
pyhamcrest, pycparser, pyasn1, pyarrow, psycopg2-binary, protobuf, portalocker,
parameterized, overrides, orjson, objsize, mock, joblib, jmespath, isodate,
iniconfig, idna, greenlet, google-crc32c, fasteners, fastavro, execnet,
exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer,
certifi, cachetools, sqlalchemy, scikit-learn, rsa, requests, pymongo, pydot,
pyasn1-modules, proto-plus, pandas, httplib2, googleapis-common-protos,
google-resumable-media, freezegun, cffi, botocore, attrs, s3transfer,
requests_mock, pytest, oauth2client, hypothesis, hdfs, grpcio-status,
google-auth, docker, cryptography, azure-core, testcontainers, pytest-xdist,
pytest-timeout, grpc-google-iam-v1, google-auth-httplib2, google-apitools,
google-api-core, boto3, azure-storage-blob, apache-beam, msal,
google-cloud-core, msal-extensions, google-cloud-vision,
google-cloud-videointelligence, google-cloud-spanner,
google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language,
google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable,
google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite,
azure-identity
Attempting uninstall: protobuf
Found existing installation: protobuf 4.23.0
Uninstalling protobuf-4.23.0:
Successfully uninstalled protobuf-4.23.0
Successfully installed PyJWT-2.6.0 apache-beam-2.48.0.dev0 attrs-23.1.0
azure-core-1.26.4 azure-identity-1.12.0 azure-storage-blob-12.16.0
boto3-1.26.130 botocore-1.29.130 cachetools-5.3.0 certifi-2023.5.7 cffi-1.15.1
charset-normalizer-3.1.0 cloudpickle-2.2.1 crcmod-1.7 cryptography-40.0.2
deprecation-2.1.0 dill-0.3.1.1 dnspython-2.3.0 docker-6.1.1 docopt-0.6.2
exceptiongroup-1.1.1 execnet-1.9.0 fastavro-1.7.4 fasteners-0.18
freezegun-1.2.2 google-api-core-2.11.0 google-apitools-0.5.31
google-auth-2.17.3 google-auth-httplib2-0.1.0 google-cloud-bigquery-3.10.0
google-cloud-bigquery-storage-2.19.1 google-cloud-bigtable-2.17.0
google-cloud-core-2.3.2 google-cloud-datastore-2.15.1 google-cloud-dlp-3.12.1
google-cloud-language-2.9.1 google-cloud-pubsub-2.16.1
google-cloud-pubsublite-1.7.0 google-cloud-recommendations-ai-0.10.3
google-cloud-spanner-3.33.0 google-cloud-videointelligence-2.11.1
google-cloud-vision-3.4.1 google-crc32c-1.5.0 google-resumable-media-2.5.0
googleapis-common-protos-1.59.0 greenlet-2.0.2 grpc-google-iam-v1-0.12.6
grpcio-status-1.54.0 hdfs-2.7.0 httplib2-0.22.0 hypothesis-6.75.2 idna-3.4
iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.2.0 mock-5.0.2
msal-1.22.0 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1
orjson-3.8.12 overrides-6.5.0 pandas-1.3.5 parameterized-0.9.0
portalocker-2.7.0 proto-plus-1.22.2 protobuf-4.22.4 psycopg2-binary-2.9.6
pyarrow-11.0.0 pyasn1-0.5.0 pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2
pyhamcrest-2.0.4 pymongo-4.3.3 pymysql-1.0.3 pyparsing-3.0.9 pytest-7.3.1
pytest-timeout-2.1.0 pytest-xdist-3.2.1 python-dateutil-2.8.2 pytz-2023.3
pyyaml-6.0 regex-2023.5.5 requests-2.30.0 requests_mock-1.10.0 rsa-4.9
s3transfer-0.6.1 scikit-learn-1.0.2 scipy-1.7.3 sortedcontainers-2.4.0
sqlalchemy-1.4.48 sqlparse-0.4.4 tenacity-8.2.2 testcontainers-3.7.1
threadpoolctl-3.1.0 tomli-2.0.1 urllib3-1.26.15 websocket-client-1.5.1
wrapt-1.15.0 zstandard-0.21.0
> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional dependencies to be installed in SDK worker container, consider using the SDK container image pre-building workflow to avoid repetitive installations. Learn more on
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
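The pre-building workflow mentioned in that message is enabled through pipeline options. A minimal sketch, assuming the Beam Python SDK's --prebuild_sdk_container_engine and --docker_registry_push_url setup options; the registry URL below is a placeholder:

from apache_beam.options.pipeline_options import PipelineOptions

# Sketch: pre-build the SDK worker container once (here with Cloud Build)
# so streaming jobs like this one skip the per-job dependency install.
options = PipelineOptions([
    '--runner=DataflowRunner',
    '--project=apache-beam-testing',  # project used by this load test
    '--region=us-central1',
    '--temp_location=gs://temp-storage-for-perf-tests/smoketests',
    '--prebuild_sdk_container_engine=cloud_build',
    # Placeholder registry; point this at a repository the job can push to.
    '--docker_registry_push_url=gcr.io/apache-beam-testing/prebuilt',
])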
INFO:root:Default Python SDK image for environment is apache/beam_python3.7_sdk:2.48.0.dev
INFO:root:Using provided Python SDK container image: gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422
INFO:root:Python SDK container image set to "gcr.io/cloud-dataflow/v1beta3/beam_python3.7_sdk:beam-master-20230422" for Docker environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683644949.147364/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683644949.147364/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683644949.147364/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0507185346.1683644949.147364/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
clientRequestId: '20230509150909148275-7901'
createTime: '2023-05-09T15:09:10.442826Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2023-05-09_08_09_09-2319503586605036788'
location: 'us-central1'
name: 'load-tests-python-dataflow-streaming-combine-1-0507185346'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2023-05-09T15:09:10.442826Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2023-05-09_08_09_09-2319503586605036788]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2023-05-09_08_09_09-2319503586605036788
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-09_08_09_09-2319503586605036788?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-09_08_09_09-2319503586605036788 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:20.215Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:26.648Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:31.671Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:32.988Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.051Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.070Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.121Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.176Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.209Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.231Z: JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.254Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.286Z: JOB_MESSAGE_DETAILED: Fusing consumer ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.318Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.349Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/KeyWithVoid into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.376Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators into Combine with Top 0/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.405Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/GroupByKey/WriteStream into Combine with Top 0/CombinePerKey/Combine/ConvertToAccumulators
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.432Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine into Combine with Top 0/CombinePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.458Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/CombinePerKey/Combine/Extract into Combine with Top 0/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.486Z: JOB_MESSAGE_DETAILED: Fusing consumer Combine with Top 0/UnKey into Combine with Top 0/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.507Z: JOB_MESSAGE_DETAILED: Fusing consumer Consume 0 into Combine with Top 0/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.533Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Consume 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.631Z: JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.664Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.700Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.734Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.767Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.923Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.959Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:33.991Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-09_08_09_09-2319503586605036788 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:09:36.181Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
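The descriptor cleanup that message suggests can be scripted against the same Monitoring API. A minimal sketch, assuming the google-cloud-monitoring client library (monitoring_v3); the delete call is left commented out because it is destructive:

from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project_name = 'projects/apache-beam-testing'  # project from this job

# List the custom metric descriptors that count against the quota.
descriptors = client.list_metric_descriptors(request={
    'name': project_name,
    'filter': 'metric.type = starts_with("custom.googleapis.com/")',
})
for descriptor in descriptors:
    print(descriptor.type)
    # Once a descriptor is confirmed unused:
    # client.delete_metric_descriptor(request={'name': descriptor.name})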
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:10:13.538Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:10:45.353Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:10:57.974Z: JOB_MESSAGE_DETAILED: All workers have finished the startup processes and began to receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T15:30:43.484Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:10:26.895Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:13:31.873Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:15:33.103Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:38:30.637Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:44:35.462Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:47:32.758Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T16:52:35.191Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T17:00:43.485Z: JOB_MESSAGE_WARNING: The workers of given job are going to be updated.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T17:41:43.173Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T17:45:44.881Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T18:03:34.286Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T18:03:41.564Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T18:15:57.662Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T18:23:58.862Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T18:50:00.112Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T19:01:03.262Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T19:40:54.589Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T19:56:14.654Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2023-05-09_08_09_09-2319503586605036788 is in state JOB_STATE_CANCELLING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T20:01:50.103Z: JOB_MESSAGE_BASIC: Cancel request is committed for workflow job: 2023-05-09_08_09_09-2319503586605036788.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T20:01:50.128Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T20:01:50.167Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T20:01:50.188Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T20:01:50.217Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-05-09T20:01:50.232Z: JOB_MESSAGE_BASIC: Stopping worker pool...
Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/combine_test.py>", line 129, in <module>
    CombineTest().run()
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 152, in run
    state = self.result.wait_until_finish(duration=self.timeout_ms)
  File "<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1557, in wait_until_finish
    '{}'.format(consoleUrl))
AssertionError: Job did not reach to a terminal state after waiting indefinitely. Console URL:
https://console.cloud.google.com/dataflow/jobs/<RegionId>/2023-05-09_08_09_09-2319503586605036788?project=<ProjectId>
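The failing call at load_test.py line 152 is the usual Beam wait-then-cancel pattern for streaming runs. A minimal sketch, assuming an already constructed `pipeline` and a `timeout_ms` value; note that with the Dataflow runner, wait_until_finish may instead raise the AssertionError shown above when the job has still not reached a terminal state:

from apache_beam.runners.runner import PipelineState

result = pipeline.run()
# wait_until_finish takes its duration in milliseconds.
state = result.wait_until_finish(duration=timeout_ms)
if not PipelineState.is_terminal(state):
    # Streaming jobs never finish on their own; request cancellation,
    # which yields the JOB_STATE_CANCELLING transition seen above.
    result.cancel()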
> Task :sdks:python:apache_beam:testing:load_tests:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 63
* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 4h 55m 11s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date
Build scan background action failed.
java.lang.IllegalArgumentException: com.gradle.enterprise.gradleplugin.internal.extension.a is not an interface
    at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:590)
    at java.lang.reflect.Proxy$ProxyClassFactory.apply(Proxy.java:557)
    at java.lang.reflect.WeakCache$Factory.get(WeakCache.java:230)
    at java.lang.reflect.WeakCache.get(WeakCache.java:127)
    at java.lang.reflect.Proxy.getProxyClass0(Proxy.java:419)
    at java.lang.reflect.Proxy.newProxyInstance(Proxy.java:719)
    at com.gradle.ProxyFactory$ProxyingInvocationHandler.createLocalProxy(ProxyFactory.java:66)
    at com.gradle.ProxyFactory$ProxyingInvocationHandler.lambda$adaptActionArg$1(ProxyFactory.java:61)
    at com.gradle.enterprise.gradleplugin.internal.extension.b$3.run(SourceFile:100)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Publishing build scan...
https://ge.apache.org/s/ugnby6zk2xb34
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]