See 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/688/display/redirect?page=changes>

Changes:

[naireenhussain] add new pubsub urn

[Pablo Estrada] Several requests to show experiments in Dataflow UI

[byronellis] Add org.pentaho to calcite relocated packages to fix vendoring

[noreply] Adding VladMatyunin as collaborator (#22239)

[noreply] Mark session runner as deprecated (#22242)

[noreply] Update google-cloud-core dependency to <3 (#22237)

[noreply] Move WC integration test to generic registration (#22248)

[noreply] Move Xlang Go examples to generic registration (#22249)


------------------------------------------
[...truncated 76.79 KB...]
WARNING: google-api-core 2.8.2 does not provide the extra 'grpcgcp'
INFO: pip is looking at multiple versions of mock to determine which version is 
compatible with other requirements. This could take a while.
  Using cached httplib2-0.20.1-py3-none-any.whl (96 kB)
INFO: This is taking longer than usual. You might need to provide the 
dependency resolver with stricter constraints to reduce runtime. See 
https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort 
this run, press Ctrl + C.
INFO: pip is looking at multiple versions of hdfs to determine which version is 
compatible with other requirements. This could take a while.
Collecting hdfs<3.0.0,>=2.1.0
  Using cached hdfs-2.6.0-py3-none-any.whl (33 kB)
INFO: pip is looking at multiple versions of joblib to determine which version 
is compatible with other requirements. This could take a while.
  Using cached hdfs-2.5.8.tar.gz (41 kB)
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
INFO: pip is looking at multiple versions of grpcio-gcp to determine which 
version is compatible with other requirements. This could take a while.
INFO: pip is looking at multiple versions of grpcio to determine which version 
is compatible with other requirements. This could take a while.
Collecting grpcio<2,>=1.33.1
  Using cached grpcio-1.47.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.5 MB)
INFO: pip is looking at multiple versions of httplib2 to determine which 
version is compatible with other requirements. This could take a while.
  Using cached grpcio-1.46.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.4 MB)
INFO: pip is looking at multiple versions of google-cloud-vision to determine 
which version is compatible with other requirements. This could take a while.
Collecting google-cloud-vision<2,>=0.38.0
  Using cached google_cloud_vision-1.0.1-py2.py3-none-any.whl (435 kB)
  Using cached google_cloud_vision-1.0.0-py2.py3-none-any.whl (435 kB)
  Using cached google_cloud_vision-0.42.0-py2.py3-none-any.whl (435 kB)
  Using cached google_cloud_vision-0.41.0-py2.py3-none-any.whl (431 kB)
  Using cached google_cloud_vision-0.40.0-py2.py3-none-any.whl (431 kB)
  Using cached google_cloud_vision-0.39.0-py2.py3-none-any.whl (418 kB)
  Using cached google_cloud_vision-0.38.1-py2.py3-none-any.whl (413 kB)
  Using cached google_cloud_vision-0.38.0-py2.py3-none-any.whl (413 kB)
INFO: pip is looking at multiple versions of google-cloud-videointelligence to 
determine which version is compatible with other requirements. This could take 
a while.
Collecting google-cloud-videointelligence<2,>=1.8.0
  Using cached google_cloud_videointelligence-1.16.2-py2.py3-none-any.whl (183 kB)
INFO: pip is looking at multiple versions of google-cloud-spanner to determine 
which version is compatible with other requirements. This could take a while.
Collecting google-cloud-spanner<2,>=1.13.0
  Using cached google_cloud_spanner-1.19.2-py2.py3-none-any.whl (255 kB)
INFO: pip is looking at multiple versions of google-cloud-recommendations-ai to 
determine which version is compatible with other requirements. This could take 
a while.
INFO: pip is looking at multiple versions of google-cloud-pubsublite to 
determine which version is compatible with other requirements. This could take 
a while.
Collecting google-cloud-pubsublite<2,>=1.2.0
  Using cached google_cloud_pubsublite-1.4.1-py2.py3-none-any.whl (265 kB)
INFO: pip is looking at multiple versions of google-cloud-pubsub to determine 
which version is compatible with other requirements. This could take a while.
Collecting google-cloud-pubsub<3,>=2.1.0
  Using cached google_cloud_pubsub-2.13.1-py2.py3-none-any.whl (234 kB)
Collecting google-crc32c<2.0dev,>=1.0
  Using cached google_crc32c-1.3.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38 kB)
Requirement already satisfied: zipp>=0.5 in 
<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
 (from importlib-metadata>=0.12->pytest<5.0,>=4.4.0->apache-beam==2.41.0.dev0) 
(3.8.1)
Collecting isodate>=0.6.0
  Using cached isodate-0.6.1-py2.py3-none-any.whl (41 kB)
Collecting requests-oauthlib>=0.5.0
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting typing-utils>=0.0.3
  Using cached typing_utils-0.1.0-py3-none-any.whl (10 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: 
filename=apache_beam-2.41.0.dev0-py3-none-any.whl size=2822936 
sha256=499eee30f418c94be25225c442ae692d2fc48846f582fa7c97d24aaf24252833
  Stored in directory: 
/home/jenkins/.cache/pip/wheels/e6/1f/28/9337974c607f5f016ef124646a645ebf2fb34bdd32df5f9dd8
Successfully built apache-beam
Installing collected packages: wcwidth, pytz, pyasn1, parameterized, docopt, 
crcmod, wrapt, websocket-client, urllib3, typing-utils, typing-extensions, 
threadpoolctl, tenacity, rsa, pyyaml, python-dateutil, pymysql, pymongo, 
pyhamcrest, pydot, pycparser, pyasn1-modules, psycopg2-binary, proto-plus, pbr, 
orjson, oauthlib, numpy, more-itertools, joblib, jmespath, isodate, idna, 
httplib2, greenlet, googleapis-common-protos, google-crc32c, fasteners, 
fastavro, execnet, dill, cloudpickle, charset-normalizer, certifi, cachetools, 
attrs, atomicwrites, sqlalchemy, scipy, requests, pyarrow, pluggy, pandas, 
overrides, oauth2client, mock, grpcio-status, grpcio-gcp, 
google-resumable-media, google-auth, freezegun, deprecation, cffi, botocore, 
scikit-learn, s3transfer, requests-oauthlib, requests_mock, pytest, hdfs, 
grpc-google-iam-v1, google-auth-httplib2, google-apitools, google-api-core, 
docker, cryptography, azure-core, testcontainers, pytest-timeout, 
pytest-forked, msrest, google-cloud-core, boto3, apache-beam, pytest-xdist, 
google-cloud-vision, google-cloud-videointelligence, google-cloud-spanner, 
google-cloud-recommendations-ai, google-cloud-pubsub, google-cloud-language, 
google-cloud-dlp, google-cloud-datastore, google-cloud-bigtable, 
google-cloud-bigquery-storage, google-cloud-bigquery, azure-storage-blob, 
google-cloud-pubsublite
  Attempting uninstall: pluggy
    Found existing installation: pluggy 1.0.0
    Uninstalling pluggy-1.0.0:
      Successfully uninstalled pluggy-1.0.0
Successfully installed apache-beam-2.41.0.dev0 atomicwrites-1.4.1 attrs-21.4.0 
azure-core-1.24.2 azure-storage-blob-12.13.0 boto3-1.24.28 botocore-1.27.28 
cachetools-4.2.4 certifi-2022.6.15 cffi-1.15.1 charset-normalizer-2.1.0 
cloudpickle-2.1.0 crcmod-1.7 cryptography-37.0.4 deprecation-2.1.0 dill-0.3.1.1 
docker-5.0.3 docopt-0.6.2 execnet-1.9.0 fastavro-1.5.2 fasteners-0.17.3 
freezegun-1.2.1 google-api-core-1.32.0 google-apitools-0.5.31 
google-auth-1.35.0 google-auth-httplib2-0.1.0 google-cloud-bigquery-2.34.4 
google-cloud-bigquery-storage-2.13.2 google-cloud-bigtable-1.7.2 
google-cloud-core-2.3.1 google-cloud-datastore-1.15.5 google-cloud-dlp-3.7.1 
google-cloud-language-1.3.2 google-cloud-pubsub-2.13.1 
google-cloud-pubsublite-1.4.2 google-cloud-recommendations-ai-0.2.0 
google-cloud-spanner-1.19.3 google-cloud-videointelligence-1.16.3 
google-cloud-vision-1.0.2 google-crc32c-1.3.0 google-resumable-media-2.3.3 
googleapis-common-protos-1.56.4 greenlet-1.1.2 grpc-google-iam-v1-0.12.4 
grpcio-gcp-0.2.2 grpcio-status-1.47.0 hdfs-2.7.0 httplib2-0.20.4 idna-3.3 
isodate-0.6.1 jmespath-1.0.1 joblib-1.1.0 mock-2.0.0 more-itertools-8.13.0 
msrest-0.7.1 numpy-1.21.6 oauth2client-4.1.3 oauthlib-3.2.0 orjson-3.7.7 
overrides-6.1.0 pandas-1.3.5 parameterized-0.8.1 pbr-5.9.0 pluggy-0.13.1 
proto-plus-1.20.6 psycopg2-binary-2.9.3 pyarrow-7.0.0 pyasn1-0.4.8 
pyasn1-modules-0.2.8 pycparser-2.21 pydot-1.4.2 pyhamcrest-1.10.1 
pymongo-3.12.3 pymysql-1.0.2 pytest-4.6.11 pytest-forked-1.4.0 
pytest-timeout-1.4.2 pytest-xdist-1.34.0 python-dateutil-2.8.2 pytz-2022.1 
pyyaml-6.0 requests-2.28.1 requests-oauthlib-1.3.1 requests_mock-1.9.3 rsa-4.8 
s3transfer-0.6.0 scikit-learn-1.0.2 scipy-1.7.3 sqlalchemy-1.4.39 
tenacity-5.1.5 testcontainers-3.6.0 threadpoolctl-3.1.0 typing-extensions-4.3.0 
typing-utils-0.1.0 urllib3-1.26.10 wcwidth-0.2.5 websocket-client-1.3.3 
wrapt-1.14.1

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK 
"<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/build/apache-beam.tar.gz>"
 to staging location.
INFO:apache_beam.runners.dataflow.dataflow_runner:Pipeline has additional 
dependencies to be installed in SDK worker container, consider using the SDK 
container image pre-building workflow to avoid repetitive installations. Learn 
more on 
https://cloud.google.com/dataflow/docs/guides/using-custom-containers#prebuild
INFO:root:Default Python SDK image for environment is 
apache/beam_python3.7_sdk:2.41.0.dev
INFO:root:Using provided Python SDK container image: 
gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617
INFO:root:Python SDK container image set to 
"gcr.io/cloud-dataflow/v1beta3/python37-fnapi:beam-master-20220617" for Docker 
environment
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the 
temp_location as staging_location: gs://temp-storage-for-perf-tests/smoketests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0713150141.1657724979.935813/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0713150141.1657724979.935813/dataflow_python_sdk.tar
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0713150141.1657724979.935813/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/smoketests/load-tests-python-dataflow-streaming-combine-1-0713150141.1657724979.935813/pipeline.pb
 in 0 seconds.
usage: combine_test.py [-h] [--runner RUNNER] [--streaming]
                       [--resource_hint RESOURCE_HINTS]
                       [--beam_services BEAM_SERVICES]
                       [--type_check_strictness {ALL_REQUIRED,DEFAULT_TO_ANY}]
                       [--type_check_additional TYPE_CHECK_ADDITIONAL]
                       [--no_pipeline_type_check] [--runtime_type_check]
                       [--performance_runtime_type_check]
                       [--allow_non_deterministic_key_coders]
                       [--allow_unsafe_triggers]
                       [--no_direct_runner_use_stacked_bundle]
                       [--direct_runner_bundle_repeat DIRECT_RUNNER_BUNDLE_REPEAT]
                       [--direct_num_workers DIRECT_NUM_WORKERS]
                       [--direct_running_mode {in_memory,multi_threading,multi_processing}]
                       [--direct_embed_docker_python]
                       [--dataflow_endpoint DATAFLOW_ENDPOINT]
                       [--project PROJECT] [--job_name JOB_NAME]
                       [--staging_location STAGING_LOCATION]
                       [--temp_location TEMP_LOCATION] [--region REGION]
                       [--service_account_email SERVICE_ACCOUNT_EMAIL]
                       [--no_auth] [--template_location TEMPLATE_LOCATION]
                       [--label LABELS] [--update]
                       [--transform_name_mapping TRANSFORM_NAME_MAPPING]
                       [--enable_streaming_engine]
                       [--dataflow_kms_key DATAFLOW_KMS_KEY]
                       [--create_from_snapshot CREATE_FROM_SNAPSHOT]
                       [--flexrs_goal {COST_OPTIMIZED,SPEED_OPTIMIZED}]
                       [--dataflow_service_option DATAFLOW_SERVICE_OPTIONS]
                       [--enable_hot_key_logging] [--enable_artifact_caching]
                       [--impersonate_service_account IMPERSONATE_SERVICE_ACCOUNT]
                       [--hdfs_host HDFS_HOST] [--hdfs_port HDFS_PORT]
                       [--hdfs_user HDFS_USER] [--hdfs_full_urls]
                       [--num_workers NUM_WORKERS]
                       [--max_num_workers MAX_NUM_WORKERS]
                       [--autoscaling_algorithm {NONE,THROUGHPUT_BASED}]
                       [--worker_machine_type MACHINE_TYPE]
                       [--disk_size_gb DISK_SIZE_GB]
                       [--worker_disk_type DISK_TYPE]
                       [--worker_region WORKER_REGION]
                       [--worker_zone WORKER_ZONE] [--zone ZONE]
                       [--network NETWORK] [--subnetwork SUBNETWORK]
                       [--worker_harness_container_image WORKER_HARNESS_CONTAINER_IMAGE]
                       [--sdk_container_image SDK_CONTAINER_IMAGE]
                       [--sdk_harness_container_image_overrides SDK_HARNESS_CONTAINER_IMAGE_OVERRIDES]
                       [--default_sdk_harness_log_level DEFAULT_SDK_HARNESS_LOG_LEVEL]
                       [--sdk_harness_log_level_overrides SDK_HARNESS_LOG_LEVEL_OVERRIDES]
                       [--use_public_ips] [--no_use_public_ips]
                       [--min_cpu_platform MIN_CPU_PLATFORM]
                       [--dataflow_worker_jar DATAFLOW_WORKER_JAR]
                       [--dataflow_job_file DATAFLOW_JOB_FILE]
                       [--experiment EXPERIMENTS]
                       [--number_of_worker_harness_threads NUMBER_OF_WORKER_HARNESS_THREADS]
                       [--profile_cpu] [--profile_memory]
                       [--profile_location PROFILE_LOCATION]
                       [--profile_sample_rate PROFILE_SAMPLE_RATE]
                       [--requirements_file REQUIREMENTS_FILE]
                       [--requirements_cache REQUIREMENTS_CACHE]
                       [--requirements_cache_only_sources]
                       [--setup_file SETUP_FILE] [--beam_plugin BEAM_PLUGINS]
                       [--pickle_library {cloudpickle,default,dill}]
                       [--save_main_session] [--sdk_location SDK_LOCATION]
                       [--extra_package EXTRA_PACKAGES]
                       [--prebuild_sdk_container_engine PREBUILD_SDK_CONTAINER_ENGINE]
                       [--prebuild_sdk_container_base_image PREBUILD_SDK_CONTAINER_BASE_IMAGE]
                       [--cloud_build_machine_type CLOUD_BUILD_MACHINE_TYPE]
                       [--docker_registry_push_url DOCKER_REGISTRY_PUSH_URL]
                       [--job_endpoint JOB_ENDPOINT]
                       [--artifact_endpoint ARTIFACT_ENDPOINT]
                       [--job_server_timeout JOB_SERVER_TIMEOUT]
                       [--environment_type ENVIRONMENT_TYPE]
                       [--environment_config ENVIRONMENT_CONFIG]
                       [--environment_option ENVIRONMENT_OPTIONS]
                       [--sdk_worker_parallelism SDK_WORKER_PARALLELISM]
                       [--environment_cache_millis ENVIRONMENT_CACHE_MILLIS]
                       [--output_executable_path OUTPUT_EXECUTABLE_PATH]
                       [--artifacts_dir ARTIFACTS_DIR] [--job_port JOB_PORT]
                       [--artifact_port ARTIFACT_PORT]
                       [--expansion_port EXPANSION_PORT]
                       [--job_server_java_launcher JOB_SERVER_JAVA_LAUNCHER]
                       [--job_server_jvm_properties JOB_SERVER_JVM_PROPERTIES]
                       [--flink_master FLINK_MASTER]
                       [--flink_version {1.12,1.13,1.14,1.15}]
                       [--flink_job_server_jar FLINK_JOB_SERVER_JAR]
                       [--flink_submit_uber_jar]
                       [--spark_master_url SPARK_MASTER_URL]
                       [--spark_job_server_jar SPARK_JOB_SERVER_JAR]
                       [--spark_submit_uber_jar]
                       [--spark_rest_url SPARK_REST_URL]
                       [--spark_version {2,3}]
                       [--on_success_matcher ON_SUCCESS_MATCHER]
                       [--dry_run DRY_RUN]
                       [--wait_until_finish_duration WAIT_UNTIL_FINISH_DURATION]
                       [--pubsub_root_url PUBSUBROOTURL]
                       [--s3_access_key_id S3_ACCESS_KEY_ID]
                       [--s3_secret_access_key S3_SECRET_ACCESS_KEY]
                       [--s3_session_token S3_SESSION_TOKEN]
                       [--s3_endpoint_url S3_ENDPOINT_URL]
                       [--s3_region_name S3_REGION_NAME]
                       [--s3_api_version S3_API_VERSION]
                       [--s3_verify S3_VERIFY] [--s3_disable_ssl]
                       [--publish_to_big_query PUBLISH_TO_BIG_QUERY]
                       [--metrics_dataset METRICS_DATASET]
                       [--metrics_table METRICS_TABLE]
                       [--influx_measurement INFLUX_MEASUREMENT]
                       [--influx_db_name INFLUX_DB_NAME]
                       [--influx_hostname INFLUX_HOSTNAME]
                       [--input_options INPUT_OPTIONS]
                       [--timeout_ms TIMEOUT_MS] [--top_count=20 TOP_COUNT=20]
combine_test.py: error: argument --top_count=20: expected one argument
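The usage text above ends with `[--top_count=20 TOP_COUNT=20]`, which suggests the literal string `--top_count=20` (value included) was registered as the option name rather than `--top_count`. A minimal argparse sketch reproducing this failure mode — the registration shown is an assumption about the job's wiring, not the actual test code:

```python
import argparse
import contextlib
import io

# Hypothetical reconstruction: the usage line "[--top_count=20 TOP_COUNT=20]"
# implies "=20" was baked into the option name itself.
parser = argparse.ArgumentParser(prog="combine_test.py")
parser.add_argument("--top_count=20")  # should have been just "--top_count"

# The command-line token "--top_count=20" now matches the option name exactly,
# so argparse looks for a *following* value, finds none, and aborts.
stderr = io.StringIO()
try:
    with contextlib.redirect_stderr(stderr):
        parser.parse_args(["--top_count=20"])
except SystemExit:
    pass

print(stderr.getvalue().splitlines()[-1])
# → combine_test.py: error: argument --top_count=20: expected one argument
```

If this reading is right, registering the flag as `--top_count` (and passing either `--top_count=20` or `--top_count 20`) would let argparse split on `=` and parse the value normally.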

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>'
 line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 13s
15 actionable tasks: 9 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/g6koowxtosg5u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
