See 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/682/display/redirect>

Changes:


------------------------------------------
[...truncated 33.18 KB...]
> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>
 to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional 
dependencies to be installed in SDK worker container, consider using the SDK 
container image pre-building workflow to avoid repetitive installations. Learn 
more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is 
apache/beam_python3.7_sdk:2.42.0.dev
INFO:root:Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker 
environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the 
temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0731125449.1659272495.024946/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0731125449.1659272495.024946/dataflow_python_sdk.tar
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0731125449.1659272495.024946/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-streaming-pardo-1-0731125449.1659272495.024946/pipeline.pb
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 clientRequestId: '20220731130135026006-1691'
 createTime: '2022-07-31T13:01:35.924784Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2022-07-31_06_01_35-280142601711712376'
 location: 'us-central1'
 name: 'load-tests-python-dataflow-streaming-pardo-1-0731125449'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2022-07-31T13:01:35.924784Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: 
[2022-07-31_06_01_35-280142601711712376]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 
2022-07-31_06_01_35-280142601711712376
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-31_06_01_35-280142601711712376?project=apache-beam-testing
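The monitoring console link above is composed from values reported earlier in this log (region, job id, project). As an illustrative sketch of that pattern, not an official API:

```python
# Sketch: the Dataflow monitoring console URL combines the job's region,
# job id, and project, all of which appear in the Create job message above.
region = "us-central1"
job_id = "2022-07-31_06_01_35-280142601711712376"
project = "apache-beam-testing"

url = (
    "https://console.cloud.google.com/dataflow/jobs/"
    f"{region}/{job_id}?project={project}"
)
print(url)
```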
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2022-07-31_06_01_35-280142601711712376 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:39.629Z: 
JOB_MESSAGE_BASIC: Worker configuration: e2-standard-2 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:40.460Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable 
parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:40.503Z: 
JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:40.625Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:40.665Z: 
JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into 
optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:40.746Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write 
steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:40.781Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:40.832Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:40.882Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:40.923Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Read synthetic/Map(<lambda at 
iobase.py:908>) into Read synthetic/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:40.969Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
 into Read synthetic/Map(<lambda at iobase.py:908>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.014Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/SplitWithSizing
 into 
ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/PairWithRestriction
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.058Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into 
ref_AppliedPTransform_Read-synthetic-SDFBoundedSourceReader-ParDo-SDFBoundedSourceDoFn-_6/ProcessElementAndRestrictionWithSizing
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.102Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Step: 0 into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.148Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Step: 1 into Step: 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.193Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Step: 2 into Step: 1
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.238Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Step: 3 into Step: 2
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.292Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Step: 4 into Step: 3
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.329Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Step: 5 into Step: 4
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.362Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Step: 6 into Step: 5
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.400Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Step: 7 into Step: 6
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.444Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Step: 8 into Step: 7
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.488Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Step: 9 into Step: 8
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.532Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End into Step: 9
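The fusion messages above chain Step: 0 through Step: 9 into one fused stage. A plain-Python analogy (not the Beam API) of what "fusing consumer Step: n+1 into Step: n" means, under the assumption that each step is a simple per-element function:

```python
# Plain-Python analogy of ParDo fusion (not the Beam API): fusing a consumer
# into its producer composes the per-element functions, so each element flows
# through all ten steps with no intermediate materialization between them.
def make_step(n):
    def step(element):
        # A real load-test step would do CPU work; identity keeps the sketch simple.
        return element
    return step

steps = [make_step(n) for n in range(10)]  # Step: 0 .. Step: 9

def fused_stage(element):
    for step in steps:  # one fused stage applies every step in sequence
        element = step(element)
    return element

print(fused_stage(b"x" * 100))  # one 100-byte record through the fused chain
```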
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.720Z: 
JOB_MESSAGE_BASIC: Running job using Streaming Engine
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.753Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.794Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.826Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.859Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.926Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:41.958Z: 
JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:01:42.010Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2022-07-31_06_01_35-280142601711712376 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:02:02.573Z: 
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric 
descriptors, so new user metrics of the form custom.googleapis.com/* will not 
be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:02:18.605Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 2 so that the 
pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:02:18.636Z: 
JOB_MESSAGE_DETAILED: Resized worker pool to 2, though goal was 5.  This could be 
a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:02:28.786Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 so that the 
pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:02:28.819Z: 
JOB_MESSAGE_DETAILED: Resized worker pool to 3, though goal was 5.  This could be 
a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:02:38.980Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 so that the 
pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:02:44.804Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:15:00.139Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:15:00.241Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:15:00.260Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:15:00.282Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:15:00.305Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:15:32.723Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on 
low average worker CPU utilization, and the pipeline having sufficiently low 
backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:15:32.761Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2022-07-31T13:15:32.798Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2022-07-31_06_01_35-280142601711712376 is in state JOB_STATE_DONE
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results 
for test: 52a926dbdb894255ba4a3abfeabd54a5 and timestamp: 1659273351.7711833:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: 
python_dataflow_streaming_pardo_1_runtime Value: 225

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD SUCCESSFUL in 15m 26s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dlgcgfmk7acgu

Build step 'Invoke Gradle script' changed build result to SUCCESS
[beam_LoadTests_Python_ParDo_Dataflow_Streaming] $ /bin/bash -xe 
/tmp/jenkins6046482883796857570.sh
+ echo '*** ParDo Python Load test: 2GB 100 byte records 200 times ***'
*** ParDo Python Load test: 2GB 100 byte records 200 times ***
[Gradle] - Launching build.
[src] $ 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/gradlew>
 -PloadTest.mainClass=apache_beam.testing.load_tests.pardo_test 
-Prunner=DataflowRunner 
'-PloadTest.args=--job_name=load-tests-python-dataflow-streaming-pardo-2-0731125449
 --project=apache-beam-testing --region=us-central1 
--temp_location=gs://temp-storage-for-perf-tests/loadtests 
--publish_to_big_query=true --metrics_dataset=load_test 
--metrics_table=python_dataflow_streaming_pardo_2 
--influx_measurement=python_streaming_pardo_2 --input_options='{"num_records": 
20000000,"key_size": 10,"value_size": 90}' --iterations=200 
--number_of_counter_operations=0 --number_of_counters=0 --num_workers=5 
--autoscaling_algorithm=NONE --influx_db_name=beam_test_metrics 
--influx_hostname=http://10.128.0.96:8086 --streaming 
--experiments=use_runner_v2,shuffle_mode=appliance --runner=DataflowRunner' 
-PpythonVersion=3.7 --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g 
-Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses 
:sdks:python:apache_beam:testing:load_tests:run
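As a sanity check on the arguments above: the --input_options JSON describes 20M records of 10-byte keys plus 90-byte values, which matches the "2GB 100 byte records" label echoed before the launch. A small arithmetic sketch:

```python
import json

# The --input_options value passed to the load test command above.
input_options = json.loads(
    '{"num_records": 20000000, "key_size": 10, "value_size": 90}'
)

record_bytes = input_options["key_size"] + input_options["value_size"]  # per record
total_bytes = input_options["num_records"] * record_bytes               # whole input
print(record_bytes, total_bytes)
```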
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy UP-TO-DATE
> Task :buildSrc:pluginDescriptors UP-TO-DATE
> Task :buildSrc:processResources UP-TO-DATE
> Task :buildSrc:classes UP-TO-DATE
> Task :buildSrc:jar UP-TO-DATE
> Task :buildSrc:assemble UP-TO-DATE
> Task :buildSrc:spotlessGroovy UP-TO-DATE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata UP-TO-DATE
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
> Task :sdks:python:setupVirtualenv UP-TO-DATE

> Task :sdks:python:sdist
<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1922375555/lib/python3.7/site-packages/setuptools/dist.py>:530:
 UserWarning: Normalizing '2.42.0.dev' to '2.42.0.dev0'
  warnings.warn(tmpl.format(**locals()))
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: no files found matching 'LICENSE.python'
warning: sdist: standard file not found: should have one of README, README.rst, 
README.txt, README.md


> Task :sdks:python:apache_beam:testing:load_tests:setupVirtualenv
Collecting pip
  Using cached pip-22.2.1-py3-none-any.whl (2.0 MB)
Installing collected packages: pip
  Attempting uninstall: pip
    Found existing installation: pip 20.1.1
    Uninstalling pip-20.1.1:
      Successfully uninstalled pip-20.1.1
Successfully installed pip-22.2.1
Collecting tox==3.20.1
  Using cached tox-3.20.1-py2.py3-none-any.whl (83 kB)
Requirement already satisfied: setuptools in 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
 (from -r 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build-requirements.txt>
 (line 20)) (47.1.0)
Collecting setuptools
  Using cached setuptools-63.2.0-py3-none-any.whl (1.2 MB)
Collecting wheel>=0.36.0
  Using cached wheel-0.37.1-py2.py3-none-any.whl (35 kB)
Collecting grpcio-tools==1.37.0
  Using cached grpcio_tools-1.37.0-cp37-cp37m-manylinux2014_x86_64.whl (2.5 MB)
Collecting mypy-protobuf==1.18
  Using cached mypy_protobuf-1.18-py3-none-any.whl (7.3 kB)
Collecting distlib==0.3.1
  Using cached distlib-0.3.1-py2.py3-none-any.whl (335 kB)
Collecting 
virtualenv!=20.0.0,!=20.0.1,!=20.0.2,!=20.0.3,!=20.0.4,!=20.0.5,!=20.0.6,!=20.0.7,>=16.0.0
  Using cached virtualenv-20.16.2-py2.py3-none-any.whl (8.8 MB)
Collecting importlib-metadata<3,>=0.12
  Using cached importlib_metadata-2.1.3-py2.py3-none-any.whl (10 kB)
Collecting six>=1.14.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting packaging>=14
  Using cached packaging-21.3-py3-none-any.whl (40 kB)
Collecting pluggy>=0.12.0
  Using cached pluggy-1.0.0-py2.py3-none-any.whl (13 kB)
Collecting toml>=0.9.4
  Using cached toml-0.10.2-py2.py3-none-any.whl (16 kB)
Collecting filelock>=3.0.0
  Using cached filelock-3.7.1-py3-none-any.whl (10 kB)
Collecting py>=1.4.17
  Using cached py-1.11.0-py2.py3-none-any.whl (98 kB)
Collecting protobuf<4.0dev,>=3.5.0.post1
  Using cached 
protobuf-3.20.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (1.0 MB)
Collecting grpcio>=1.37.0
  Using cached 
grpcio-1.48.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.6 MB)
Collecting zipp>=0.5
  Using cached zipp-3.8.1-py3-none-any.whl (5.6 kB)
Collecting pyparsing!=3.0.5,>=2.0.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting platformdirs<3,>=2
  Using cached platformdirs-2.5.2-py3-none-any.whl (14 kB)
Installing collected packages: distlib, zipp, wheel, toml, six, setuptools, 
pyparsing, py, protobuf, platformdirs, filelock, packaging, mypy-protobuf, 
importlib-metadata, grpcio, virtualenv, pluggy, grpcio-tools, tox
  Attempting uninstall: setuptools
    Found existing installation: setuptools 47.1.0
    Uninstalling setuptools-47.1.0:
      Successfully uninstalled setuptools-47.1.0
Successfully installed distlib-0.3.1 filelock-3.7.1 grpcio-1.48.0 
grpcio-tools-1.37.0 importlib-metadata-2.1.3 mypy-protobuf-1.18 packaging-21.3 
platformdirs-2.5.2 pluggy-1.0.0 protobuf-3.20.1 py-1.11.0 pyparsing-3.0.9 
setuptools-63.2.0 six-1.16.0 toml-0.10.2 tox-3.20.1 virtualenv-20.16.2 
wheel-0.37.1 zipp-3.8.1

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest
Processing 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
Collecting crcmod<2.0,>=1.7
  Using cached crcmod-1.7-cp37-cp37m-linux_x86_64.whl
Collecting orjson<4.0
  Downloading orjson-3.7.11.tar.gz (946 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 946.6/946.6 kB 17.2 MB/s eta 0:00:00
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'error'
  error: subprocess-exited-with-error
  
  × Preparing metadata (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [6 lines of output]
      
      Cargo, the Rust package manager, is not installed or is not on PATH.
      This package requires Rust and Cargo to compile extensions. Install it 
through
      the system's package manager or via https://rustup.rs/
      
      Checking for Rust toolchain....
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem 
with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
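The root cause above is that pip fell back to building orjson-3.7.11 from source, and that build requires the Rust toolchain (Cargo) on PATH. A minimal pre-flight check, offered as an illustrative sketch rather than part of the build:

```python
import shutil

def rust_toolchain_available() -> bool:
    """True if Cargo (the Rust package manager) is on PATH, which the
    "Checking for Rust toolchain" probe in the error above also requires."""
    return shutil.which("cargo") is not None

# If this prints False, source builds of orjson will fail as seen above;
# remediation options include installing Rust via https://rustup.rs/ or
# pinning orjson to a version that ships a prebuilt wheel for this interpreter.
print(rust_toolchain_available())
```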

> Task :sdks:python:apache_beam:testing:load_tests:installGcpTest FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':sdks:python:apache_beam:testing:load_tests:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 22s
14 actionable tasks: 3 executed, 11 up-to-date

Publishing build scan...
https://gradle.com/s/wchuyefenbq4k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
