See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/1212/display/redirect>

Changes:


------------------------------------------
[...truncated 55.17 KB...]
Collecting charset-normalizer<4,>=2 (from 
requests<3.0.0,>=2.24.0->apache-beam==2.52.0.dev0)
  Obtaining dependency information for charset-normalizer<4,>=2 from 
https://files.pythonhosted.org/packages/1e/c8/fd52271326c052f95f47ef718b018aa2bc3fd097d9bac44d7d48894c6130/charset_normalizer-3.3.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata
  Using cached 
charset_normalizer-3.3.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata
 (32 kB)
Collecting idna<4,>=2.5 (from requests<3.0.0,>=2.24.0->apache-beam==2.52.0.dev0)
  Using cached idna-3.4-py3-none-any.whl (61 kB)
Collecting urllib3<3,>=1.21.1 (from 
requests<3.0.0,>=2.24.0->apache-beam==2.52.0.dev0)
  Obtaining dependency information for urllib3<3,>=1.21.1 from 
https://files.pythonhosted.org/packages/26/40/9957270221b6d3e9a3b92fdfba80dd5c9661ff45a664b47edd5d00f707f5/urllib3-2.0.6-py3-none-any.whl.metadata
  Using cached urllib3-2.0.6-py3-none-any.whl.metadata (6.6 kB)
Collecting certifi>=2017.4.17 (from 
requests<3.0.0,>=2.24.0->apache-beam==2.52.0.dev0)
  Obtaining dependency information for certifi>=2017.4.17 from 
https://files.pythonhosted.org/packages/4c/dd/2234eab22353ffc7d94e8d13177aaa050113286e93e7b40eae01fbf7c3d9/certifi-2023.7.22-py3-none-any.whl.metadata
  Using cached certifi-2023.7.22-py3-none-any.whl.metadata (2.2 kB)
Collecting scipy>=1.5.0 (from scikit-learn>=0.20.0->apache-beam==2.52.0.dev0)
  Using cached 
scipy-1.10.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (34.5 MB)
Collecting threadpoolctl>=2.0.0 (from 
scikit-learn>=0.20.0->apache-beam==2.52.0.dev0)
  Obtaining dependency information for threadpoolctl>=2.0.0 from 
https://files.pythonhosted.org/packages/81/12/fd4dea011af9d69e1cad05c75f3f7202cdcbeac9b712eea58ca779a72865/threadpoolctl-3.2.0-py3-none-any.whl.metadata
  Using cached threadpoolctl-3.2.0-py3-none-any.whl.metadata (10.0 kB)
Collecting greenlet!=0.4.17 (from 
sqlalchemy<2.0,>=1.3->apache-beam==2.52.0.dev0)
  Obtaining dependency information for greenlet!=0.4.17 from 
https://files.pythonhosted.org/packages/75/5a/4ecc17749c54882dd3422befcd35fec5128ae2a72138844899322af0b55a/greenlet-3.0.0-cp38-cp38-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl.metadata
  Using cached 
greenlet-3.0.0-cp38-cp38-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl.metadata
 (3.8 kB)
Collecting docker>=4.0.0 (from 
testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.52.0.dev0)
  Obtaining dependency information for docker>=4.0.0 from 
https://files.pythonhosted.org/packages/db/be/3032490fa33b36ddc8c4b1da3252c6f974e7133f1a50de00c6b85cca203a/docker-6.1.3-py3-none-any.whl.metadata
  Using cached docker-6.1.3-py3-none-any.whl.metadata (3.5 kB)
Collecting wrapt (from 
testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.52.0.dev0)
  Using cached 
wrapt-1.15.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 (81 kB)
Collecting deprecation (from 
testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.52.0.dev0)
  Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting pymysql (from 
testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.52.0.dev0)
  Obtaining dependency information for pymysql from 
https://files.pythonhosted.org/packages/e5/30/20467e39523d0cfc2b6227902d3687a16364307260c75e6a1cb4422b0c62/PyMySQL-1.1.0-py3-none-any.whl.metadata
  Using cached PyMySQL-1.1.0-py3-none-any.whl.metadata (4.4 kB)
Collecting urllib3<3,>=1.21.1 (from 
requests<3.0.0,>=2.24.0->apache-beam==2.52.0.dev0)
  Obtaining dependency information for urllib3<3,>=1.21.1 from 
https://files.pythonhosted.org/packages/48/fe/a5c6cc46e9fe9171d7ecf0f33ee7aae14642f8d74baa7af4d7840f9358be/urllib3-1.26.17-py2.py3-none-any.whl.metadata
  Using cached urllib3-1.26.17-py2.py3-none-any.whl.metadata (48 kB)
Collecting pycparser (from 
cffi>=1.12->cryptography>=41.0.2->apache-beam==2.52.0.dev0)
  Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Collecting websocket-client>=0.32.0 (from 
docker>=4.0.0->testcontainers[mysql]<4.0.0,>=3.0.3->apache-beam==2.52.0.dev0)
  Obtaining dependency information for websocket-client>=0.32.0 from 
https://files.pythonhosted.org/packages/0b/50/49e0d7342e5d441d43b525d6c84656ea40aea3e58d530004d07b22bc9b04/websocket_client-1.6.3-py3-none-any.whl.metadata
  Using cached websocket_client-1.6.3-py3-none-any.whl.metadata (7.7 kB)
Collecting google-crc32c<2.0dev,>=1.0 (from 
google-resumable-media<3.0dev,>=0.6.0->google-cloud-bigquery<4,>=2.0.0->apache-beam==2.52.0.dev0)
  Using cached 
google_crc32c-1.5.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl 
(32 kB)
Collecting PyJWT[crypto]<3,>=1.0.0 (from 
msal<2.0.0,>=1.20.0->azure-identity<2,>=1.12.0->apache-beam==2.52.0.dev0)
  Obtaining dependency information for PyJWT[crypto]<3,>=1.0.0 from 
https://files.pythonhosted.org/packages/2b/4f/e04a8067c7c96c364cef7ef73906504e2f40d690811c021e1a1901473a19/PyJWT-2.8.0-py3-none-any.whl.metadata
  Using cached PyJWT-2.8.0-py3-none-any.whl.metadata (4.2 kB)
Collecting portalocker<3,>=1.0 (from 
msal-extensions<2.0.0,>=0.3.0->azure-identity<2,>=1.12.0->apache-beam==2.52.0.dev0)
  Obtaining dependency information for portalocker<3,>=1.0 from 
https://files.pythonhosted.org/packages/17/9e/87671efcca80ba6203811540ed1f9c0462c1609d2281d7b7f53cef05da3d/portalocker-2.8.2-py3-none-any.whl.metadata
  Using cached portalocker-2.8.2-py3-none-any.whl.metadata (8.5 kB)
Collecting pyasn1>=0.1.7 (from 
oauth2client>=1.4.12->google-apitools<0.5.32,>=0.5.31->apache-beam==2.52.0.dev0)
  Using cached pyasn1-0.5.0-py2.py3-none-any.whl (83 kB)
Collecting backports.zoneinfo (from 
tzlocal>=1.2->js2py<1,>=0.74->apache-beam==2.52.0.dev0)
  Using cached backports.zoneinfo-0.2.1-cp38-cp38-manylinux1_x86_64.whl (74 kB)
Using cached azure_core-1.29.4-py3-none-any.whl (192 kB)
Using cached azure_identity-1.15.0b1-py3-none-any.whl (162 kB)
Using cached azure_storage_blob-12.18.2-py3-none-any.whl (392 kB)
Using cached boto3-1.28.61-py3-none-any.whl (135 kB)
Using cached cryptography-41.0.4-cp37-abi3-manylinux_2_28_x86_64.whl (4.4 MB)
Using cached 
fastavro-1.8.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.1 MB)
Using cached fasteners-0.19-py3-none-any.whl (18 kB)
Using cached google_api_core-2.12.0-py3-none-any.whl (121 kB)
Using cached google_auth-2.23.2-py2.py3-none-any.whl (181 kB)
Using cached google_auth_httplib2-0.1.1-py2.py3-none-any.whl (9.3 kB)
Using cached google_cloud_aiplatform-1.34.0-py2.py3-none-any.whl (3.1 MB)
Using cached google_cloud_bigquery-3.12.0-py2.py3-none-any.whl (220 kB)
Using cached google_cloud_bigquery_storage-2.22.0-py2.py3-none-any.whl (190 kB)
Using cached google_cloud_bigtable-2.21.0-py2.py3-none-any.whl (293 kB)
Using cached google_cloud_core-2.3.3-py2.py3-none-any.whl (29 kB)
Using cached google_cloud_datastore-2.18.0-py2.py3-none-any.whl (177 kB)
Using cached google_cloud_dlp-3.12.3-py2.py3-none-any.whl (143 kB)
Using cached google_cloud_language-2.11.1-py2.py3-none-any.whl (138 kB)
Using cached google_cloud_pubsub-2.18.4-py2.py3-none-any.whl (265 kB)
Using cached google_cloud_pubsublite-1.8.3-py2.py3-none-any.whl (288 kB)
Using cached google_cloud_recommendations_ai-0.10.5-py2.py3-none-any.whl (173 
kB)
Using cached google_cloud_spanner-3.40.1-py2.py3-none-any.whl (332 kB)
Using cached google_cloud_videointelligence-2.11.4-py2.py3-none-any.whl (229 kB)
Using cached google_cloud_vision-3.4.4-py2.py3-none-any.whl (444 kB)
Using cached hypothesis-6.87.3-py3-none-any.whl (420 kB)
Using cached joblib-1.3.2-py3-none-any.whl (302 kB)
Using cached mock-5.1.0-py3-none-any.whl (30 kB)
Using cached 
orjson-3.9.7-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (138 kB)
Using cached proto_plus-1.22.3-py3-none-any.whl (48 kB)
Using cached 
psycopg2_binary-2.9.9-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl 
(3.0 MB)
Using cached 
pymongo-4.5.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (687 kB)
Using cached pytest-7.4.2-py3-none-any.whl (324 kB)
Using cached pytest_xdist-3.3.1-py3-none-any.whl (41 kB)
Using cached pytz-2023.3.post1-py2.py3-none-any.whl (502 kB)
Using cached 
PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (736 kB)
Using cached 
regex-2023.10.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (776 
kB)
Using cached requests-2.31.0-py3-none-any.whl (62 kB)
Using cached requests_mock-1.11.0-py2.py3-none-any.whl (28 kB)
Using cached 
scikit_learn-1.3.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl 
(11.1 MB)
Using cached 
SQLAlchemy-1.4.49-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 (1.6 MB)
Using cached tenacity-8.2.3-py3-none-any.whl (24 kB)
Using cached typing_extensions-4.8.0-py3-none-any.whl (31 kB)
Using cached botocore-1.31.61-py3-none-any.whl (11.2 MB)
Using cached certifi-2023.7.22-py3-none-any.whl (158 kB)
Using cached 
cffi-1.16.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (444 kB)
Using cached 
charset_normalizer-3.3.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
 (137 kB)
Using cached dnspython-2.4.2-py3-none-any.whl (300 kB)
Using cached docker-6.1.3-py3-none-any.whl (148 kB)
Using cached exceptiongroup-1.1.3-py3-none-any.whl (14 kB)
Using cached execnet-2.0.2-py3-none-any.whl (37 kB)
Using cached google_cloud_resource_manager-1.10.4-py2.py3-none-any.whl (320 kB)
Using cached google_cloud_storage-2.11.0-py2.py3-none-any.whl (118 kB)
Using cached google_resumable_media-2.6.0-py2.py3-none-any.whl (80 kB)
Using cached googleapis_common_protos-1.60.0-py2.py3-none-any.whl (227 kB)
Using cached 
greenlet-3.0.0-cp38-cp38-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl (617 
kB)
Using cached grpcio_status-1.59.0-py3-none-any.whl (14 kB)
Using cached msal-1.24.1-py2.py3-none-any.whl (95 kB)
Using cached pyparsing-3.1.1-py3-none-any.whl (103 kB)
Using cached s3transfer-0.7.0-py3-none-any.whl (79 kB)
Using cached threadpoolctl-3.2.0-py3-none-any.whl (15 kB)
Using cached tzlocal-5.1-py3-none-any.whl (21 kB)
Using cached urllib3-1.26.17-py2.py3-none-any.whl (143 kB)
Using cached PyMySQL-1.1.0-py3-none-any.whl (44 kB)
Using cached portalocker-2.8.2-py3-none-any.whl (17 kB)
Using cached websocket_client-1.6.3-py3-none-any.whl (57 kB)
Using cached PyJWT-2.8.0-py3-none-any.whl (22 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: 
filename=apache_beam-2.52.0.dev0-py3-none-any.whl size=3331546 
sha256=a43bf2a0b0644eae3d427329c55fe72fcea1f98790ad24cbebeec8ce8ff0de2b
  Stored in directory: 
/home/jenkins/.cache/pip/wheels/a6/31/80/c62534796659a7bd654a64a5c44b587857b5d93e8450d12360
Successfully built apache-beam
Installing collected packages: sortedcontainers, pytz, pyjsparser, docopt, 
crcmod, zstandard, wrapt, websocket-client, urllib3, typing-extensions, 
threadpoolctl, tenacity, sqlparse, six, shapely, scipy, regex, pyyaml, 
pyparsing, pymysql, PyJWT, pyhamcrest, pycparser, pyasn1, pyarrow, 
psycopg2-binary, proto-plus, portalocker, parameterized, overrides, orjson, 
objsize, mock, joblib, jmespath, iniconfig, idna, greenlet, 
googleapis-common-protos, google-crc32c, fasteners, fastavro, execnet, 
exceptiongroup, dnspython, dill, deprecation, cloudpickle, charset-normalizer, 
certifi, backports.zoneinfo, attrs, tzlocal, sqlalchemy, scikit-learn, rsa, 
requests, python-dateutil, pytest, pymongo, pydot, pyasn1-modules, isodate, 
hypothesis, httplib2, grpcio-status, google-resumable-media, cffi, 
requests_mock, pytest-xdist, pytest-timeout, pandas, oauth2client, js2py, hdfs, 
grpc-google-iam-v1, google-auth, freezegun, docker, cryptography, botocore, 
azure-core, testcontainers, s3transfer, google-auth-httplib2, google-apitools, 
google-api-core, azure-storage-blob, apache-beam, msal, google-cloud-core, 
boto3, msal-extensions, google-cloud-vision, google-cloud-videointelligence, 
google-cloud-storage, google-cloud-spanner, google-cloud-resource-manager, 
google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, 
google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, 
google-cloud-bigquery-storage, google-cloud-bigquery, google-cloud-pubsublite, 
google-cloud-aiplatform, azure-identity
Successfully installed PyJWT-2.8.0 apache-beam-2.52.0.dev0 attrs-23.1.0 
azure-core-1.29.4 azure-identity-1.15.0b1 azure-storage-blob-12.18.2 
backports.zoneinfo-0.2.1 boto3-1.28.61 botocore-1.31.61 certifi-2023.7.22 
cffi-1.16.0 charset-normalizer-3.3.0 cloudpickle-2.2.1 crcmod-1.7 
cryptography-41.0.4 deprecation-2.1.0 dill-0.3.1.1 dnspython-2.4.2 docker-6.1.3 
docopt-0.6.2 exceptiongroup-1.1.3 execnet-2.0.2 fastavro-1.8.4 fasteners-0.19 
freezegun-1.2.2 google-api-core-2.12.0 google-apitools-0.5.31 
google-auth-2.23.2 google-auth-httplib2-0.1.1 google-cloud-aiplatform-1.34.0 
google-cloud-bigquery-3.12.0 google-cloud-bigquery-storage-2.22.0 
google-cloud-bigtable-2.21.0 google-cloud-core-2.3.3 
google-cloud-datastore-2.18.0 google-cloud-dlp-3.12.3 
google-cloud-language-2.11.1 google-cloud-pubsub-2.18.4 
google-cloud-pubsublite-1.8.3 google-cloud-recommendations-ai-0.10.5 
google-cloud-resource-manager-1.10.4 google-cloud-spanner-3.40.1 
google-cloud-storage-2.11.0 google-cloud-videointelligence-2.11.4 
google-cloud-vision-3.4.4 google-crc32c-1.5.0 google-resumable-media-2.6.0 
googleapis-common-protos-1.60.0 greenlet-3.0.0 grpc-google-iam-v1-0.12.6 
grpcio-status-1.59.0 hdfs-2.7.2 httplib2-0.22.0 hypothesis-6.87.3 idna-3.4 
iniconfig-2.0.0 isodate-0.6.1 jmespath-1.0.1 joblib-1.3.2 js2py-0.74 mock-5.1.0 
msal-1.24.1 msal-extensions-1.0.0 oauth2client-4.1.3 objsize-0.6.1 orjson-3.9.7 
overrides-6.5.0 pandas-1.5.3 parameterized-0.9.0 portalocker-2.8.2 
proto-plus-1.22.3 psycopg2-binary-2.9.9 pyarrow-11.0.0 pyasn1-0.5.0 
pyasn1-modules-0.3.0 pycparser-2.21 pydot-1.4.2 pyhamcrest-2.0.4 
pyjsparser-2.7.1 pymongo-4.5.0 pymysql-1.1.0 pyparsing-3.1.1 pytest-7.4.2 
pytest-timeout-2.1.0 pytest-xdist-3.3.1 python-dateutil-2.8.2 pytz-2023.3.post1 
pyyaml-6.0.1 regex-2023.10.3 requests-2.31.0 requests_mock-1.11.0 rsa-4.9 
s3transfer-0.7.0 scikit-learn-1.3.1 scipy-1.10.1 shapely-1.8.5.post1 six-1.16.0 
sortedcontainers-2.4.0 sqlalchemy-1.4.49 sqlparse-0.4.4 tenacity-8.2.3 
testcontainers-3.7.1 threadpoolctl-3.2.0 typing-extensions-4.8.0 tzlocal-5.1 
urllib3-1.26.17 websocket-client-1.6.3 wrapt-1.15.0 zstandard-0.21.0

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK 
"<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/build/apache-beam.tar.gz";>
 to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional 
dependencies to be installed in SDK worker container, consider using the SDK 
container image pre-building workflow to avoid repetitive installations. Learn 
more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/beam_python3.8_sdk:beam-master-20230927
INFO:root:Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/beam_python3.8_sdk:beam-master-20230927" for 
Docker environment
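
For reference, the pre-building workflow mentioned above is driven by pipeline 
options; a minimal sketch, assuming the documented 
--prebuild_sdk_container_engine and --docker_registry_push_url options (the 
registry URL below is a placeholder):

    # Bake the extra dependencies into a custom worker image once with
    # Cloud Build, instead of re-installing them on every worker at startup.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        prebuild_sdk_container_engine="cloud_build",
        # Placeholder registry; any registry the job's service account can push to.
        docker_registry_push_url="gcr.io/my-project/prebuilt-beam-sdk",
        # Alternatively, pin an explicit image, as this job does:
        sdk_container_image="gcr.io/cloud-dataflow/v1beta3/beam_python3.8_sdk:beam-master-20230927",
    )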
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function pack_combiners at 0x7f303af970d0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:====================
 <function sort_stages at 0x7f303af978b0> ====================
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-bqio-write-python-batch-10gb1006155343.1696617971.552925/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-bqio-write-python-batch-10gb1006155343.1696617971.552925/dataflow_python_sdk.tar
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-bqio-write-python-batch-10gb1006155343.1696617971.552925/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-bqio-write-python-batch-10gb1006155343.1696617971.552925/pipeline.pb
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20231006184611553928-4777'
 createTime: '2023-10-06T18:46:12.690029Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2023-10-06_11_46_12-8651577650799876510'
 location: 'us-central1'
 name: 'performance-tests-bqio-write-python-batch-10gb1006155343'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2023-10-06T18:46:12.690029Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: 
[2023-10-06_11_46_12-8651577650799876510]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 
2023-10-06_11_46_12-8651577650799876510
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-10-06_11_46_12-8651577650799876510?project=apache-beam-testing
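
The job fields above reflect the submission options; a rough sketch of 
launching the same batch job from Python, with values copied from this log 
(temp_location inferred from the GCS upload path above):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner="DataflowRunner",
        project="apache-beam-testing",
        region="us-central1",
        job_name="performance-tests-bqio-write-python-batch-10gb1006155343",
        temp_location="gs://temp-storage-for-perf-tests/loadtests",
    )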
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2023-10-06_11_46_12-8651577650799876510 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:46:16.044Z: 
JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:46:19.348Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse+Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at 
core.py:3759>)+Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:46:19.373Z: 
JOB_MESSAGE_BASIC: Executing operation Produce rows/Impulse+Produce 
rows/EmitSource+ref_AppliedPTransform_Produce-rows-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction+ref_AppliedPTransform_Produce-rows-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:46:19.395Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse+Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at 
core.py:3759>)+Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode)+Write to 
BigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix+Write to 
BigQuery/BigQueryBatchFileLoads/GenerateFilePrefix+Write to 
BigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix+Write to 
BigQuery/BigQueryBatchFileLoads/SchemaModJobNamePrefix
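
The ImpulseEmptyPC/ImpulseSingleElementPC stages and the *JobNamePrefix steps 
above are internal expansions of Beam's batch file-loads path into BigQuery. 
A minimal sketch of the transform that produces them, with a placeholder 
element and schema (the actual test generates roughly 10 GiB of synthetic 
rows):

    import apache_beam as beam

    with beam.Pipeline(options=options) as p:
        (p
         | "Produce rows" >> beam.Create([{"data": b"x" * 1024}])  # placeholder rows
         | "Write to BigQuery" >> beam.io.WriteToBigQuery(
             "apache-beam-testing:beam_performance.bqio_write_10GB",
             schema="data:BYTES",
             # FILE_LOADS is the batch path whose stages appear in this log.
             method=beam.io.WriteToBigQuery.Method.FILE_LOADS,
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))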
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:46:19.407Z: 
JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2023-10-06_11_46_12-8651577650799876510 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:46:31.589Z: 
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric 
descriptors, so new user metrics of the form custom.googleapis.com/* will not 
be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
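
The cleanup this message suggests can be scripted; a hedged sketch using the 
google-cloud-monitoring client (list first and delete only after review, 
since descriptor deletion is irreversible):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    descriptors = client.list_metric_descriptors(request={
        "name": "projects/apache-beam-testing",
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in descriptors:
        print(descriptor.type)  # review before deleting
        # client.delete_metric_descriptor(name=descriptor.name)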
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:23.688Z: 
JOB_MESSAGE_BASIC: All workers have finished the startup processes and began to 
receive work requests.
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:23.866Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse+Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at 
core.py:3759>)+Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:24.734Z: 
JOB_MESSAGE_BASIC: Finished operation Produce rows/Impulse+Produce 
rows/EmitSource+ref_AppliedPTransform_Produce-rows-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction+ref_AppliedPTransform_Produce-rows-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:24.895Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse+Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at 
core.py:3759>)+Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode)+Write to 
BigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix+Write to 
BigQuery/BigQueryBatchFileLoads/GenerateFilePrefix+Write to 
BigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix+Write to 
BigQuery/BigQueryBatchFileLoads/SchemaModJobNamePrefix
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.037Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.063Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.084Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.110Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.112Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.112Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.132Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs)/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.142Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.158Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.163Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.188Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs)/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.208Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.323Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.855Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:25.957Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:26.319Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:49:26.427Z: 
JOB_MESSAGE_BASIC: Executing operation 
ref_AppliedPTransform_Produce-rows-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing+Count
 messages+Format+Measure time+Write to 
BigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+Write to 
BigQuery/BigQueryBatchFileLoads/AppendDestination+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write
 to BigQuery/BigQueryBatchFileLoads/IdentityWorkaround+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:02.530Z: 
JOB_MESSAGE_BASIC: Finished operation 
ref_AppliedPTransform_Produce-rows-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing+Count
 messages+Format+Measure time+Write to 
BigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+Write to 
BigQuery/BigQueryBatchFileLoads/AppendDestination+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write
 to BigQuery/BigQueryBatchFileLoads/IdentityWorkaround+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:02.596Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:02.638Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:02.683Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read+Write to 
BigQuery/BigQueryBatchFileLoads/DropShardNumber+Write to 
BigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile+Write to 
BigQuery/BigQueryBatchFileLoads/DestinationFilesUnion/InputIdentity+Write to 
BigQuery/BigQueryBatchFileLoads/IdentityWorkaround+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:02.907Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read+Write to 
BigQuery/BigQueryBatchFileLoads/DropShardNumber+Write to 
BigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile+Write to 
BigQuery/BigQueryBatchFileLoads/DestinationFilesUnion/InputIdentity+Write to 
BigQuery/BigQueryBatchFileLoads/IdentityWorkaround+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:02.953Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:04.246Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:04.296Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:04.419Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:04.516Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+Write
 to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)+Write
 to BigQuery/BigQueryBatchFileLoads/MapTuple(<lambda at 
bigquery_file_loads.py:1111>)+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)+Write to 
BigQuery/BigQueryBatchFileLoads/GroupByKey/Write+Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:26.905Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+Write
 to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)+Write
 to BigQuery/BigQueryBatchFileLoads/MapTuple(<lambda at 
bigquery_file_loads.py:1111>)+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(UpdateDestinationSchema)+Write to 
BigQuery/BigQueryBatchFileLoads/GroupByKey/Write+Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/ParDo(TriggerLoadJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:27.040Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:27.065Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)/View-python_side_input1
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:27.083Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:27.091Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:27.112Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)/View-python_side_input1
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:27.180Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:27.200Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupByKey/Read+Write to 
BigQuery/BigQueryBatchFileLoads/MapTuple(<lambda at 
bigquery_file_loads.py:1116>)+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:28.416Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupByKey/Read+Write to 
BigQuery/BigQueryBatchFileLoads/MapTuple(<lambda at 
bigquery_file_loads.py:1116>)+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:28.510Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:28.560Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue/View-python_side_input0
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:28.651Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:28.806Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:28.900Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:29.197Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:29.245Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:29.289Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:29.340Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+Write 
to BigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:29.496Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+Write 
to BigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames/Keys+Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:53:29.650Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2023-10-06T18:55:55.569Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2023-10-06_11_46_12-8651577650799876510 is in state JOB_STATE_DONE
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results 
for test: a818769bcb354ec7ac45726121b023af and timestamp: 1696618563.3284833:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: 
bqio_write_10GB_results_count_messages_total_messages Value: 10485760
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: 
bqio_write_10GB_results_runtime Value: 196
INFO:apache_beam.io.gcp.tests.utils:Clean up a BigQuery table with project: 
apache-beam-testing, dataset: beam_performance, table: bqio_write_10GB.
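
That cleanup step presumably reduces to a table delete; a minimal sketch with 
the google-cloud-bigquery client (assumes credentials with delete rights on 
the dataset):

    from google.cloud import bigquery

    client = bigquery.Client(project="apache-beam-testing")
    # not_found_ok avoids an error if an earlier run already removed the table.
    client.delete_table(
        "apache-beam-testing.beam_performance.bqio_write_10GB", not_found_ok=True)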

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'>
 line: 62

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> Could not get unknown property 'execResult' for task ':sdks:python:apache_beam:testing:load_tests:run' of type org.gradle.api.tasks.Exec.

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Get more help at https://help.gradle.org.

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

For more on this, please refer to 
https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings
 in the Gradle documentation.

BUILD FAILED in 12m 20s
9 actionable tasks: 8 executed, 1 from cache

Publishing build scan...
https://ge.apache.org/s/ofr4efdbuqqlk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

