damccorm opened a new issue, #22692:
URL: https://github.com/apache/beam/issues/22692

   ### What happened?
   
   The two most recent runs of several Python Dataflow streaming load tests have failed:
   
   - https://ci-beam.apache.org/job/beam_LoadTests_Python_ParDo_Dataflow_Streaming/
   - https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Streaming/
   - https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_Dataflow_Streaming/
   - https://ci-beam.apache.org/job/beam_LoadTests_Python_Combine_Dataflow_Streaming/
   
   Generally the errors are of the form:
   
   ```
   08:06:33 usage: group_by_key_test.py [-h] [--runner RUNNER] [--streaming]
   08:06:33                             [--resource_hint RESOURCE_HINTS]
   08:06:33                             [--beam_services BEAM_SERVICES]
   08:06:33                             [--type_check_strictness {ALL_REQUIRED,DEFAULT_TO_ANY}]
   08:06:33                             [--type_check_additional TYPE_CHECK_ADDITIONAL]
   08:06:33                             [--no_pipeline_type_check] [--runtime_type_check]
   08:06:33                             [--performance_runtime_type_check]
   08:06:33                             [--allow_non_deterministic_key_coders]
   08:06:33                             [--allow_unsafe_triggers]
   08:06:33                             [--no_direct_runner_use_stacked_bundle]
   08:06:33                             [--direct_runner_bundle_repeat DIRECT_RUNNER_BUNDLE_REPEAT]
   08:06:33                             [--direct_num_workers DIRECT_NUM_WORKERS]
   08:06:33                             [--direct_running_mode {in_memory,multi_threading,multi_processing}]
   08:06:33                             [--direct_embed_docker_python]
   08:06:33                             [--dataflow_endpoint DATAFLOW_ENDPOINT]
   08:06:33                             [--project PROJECT] [--job_name JOB_NAME]
   08:06:33                             [--staging_location STAGING_LOCATION]
   08:06:33                             [--temp_location TEMP_LOCATION] [--region REGION]
   08:06:33                             [--service_account_email SERVICE_ACCOUNT_EMAIL]
   08:06:33                             [--no_auth]
   08:06:33                             [--template_location TEMPLATE_LOCATION]
   08:06:33                             [--label LABELS] [--update]
   08:06:33                             [--transform_name_mapping TRANSFORM_NAME_MAPPING]
   08:06:33                             [--enable_streaming_engine]
   08:06:33                             [--dataflow_kms_key DATAFLOW_KMS_KEY]
   08:06:33                             [--create_from_snapshot CREATE_FROM_SNAPSHOT]
   08:06:33                             [--flexrs_goal {COST_OPTIMIZED,SPEED_OPTIMIZED}]
   08:06:33                             [--dataflow_service_option DATAFLOW_SERVICE_OPTIONS]
   08:06:33                             [--enable_hot_key_logging]
   08:06:33                             [--enable_artifact_caching]
   08:06:33                             [--impersonate_service_account IMPERSONATE_SERVICE_ACCOUNT]
   08:06:33                             [--hdfs_host HDFS_HOST] [--hdfs_port HDFS_PORT]
   08:06:33                             [--hdfs_user HDFS_USER] [--hdfs_full_urls]
   08:06:33                             [--num_workers NUM_WORKERS]
   08:06:33                             [--max_num_workers MAX_NUM_WORKERS]
   08:06:33                             [--autoscaling_algorithm {NONE,THROUGHPUT_BASED}]
   08:06:33                             [--worker_machine_type MACHINE_TYPE]
   08:06:33                             [--disk_size_gb DISK_SIZE_GB]
   08:06:33                             [--worker_disk_type DISK_TYPE]
   08:06:33                             [--worker_region WORKER_REGION]
   08:06:33                             [--worker_zone WORKER_ZONE] [--zone ZONE]
   08:06:33                             [--network NETWORK] [--subnetwork SUBNETWORK]
   08:06:33                             [--worker_harness_container_image WORKER_HARNESS_CONTAINER_IMAGE]
   08:06:33                             [--sdk_container_image SDK_CONTAINER_IMAGE]
   08:06:33                             [--sdk_harness_container_image_overrides SDK_HARNESS_CONTAINER_IMAGE_OVERRIDES]
   08:06:33                             [--default_sdk_harness_log_level DEFAULT_SDK_HARNESS_LOG_LEVEL]
   08:06:33                             [--sdk_harness_log_level_overrides SDK_HARNESS_LOG_LEVEL_OVERRIDES]
   08:06:33                             [--use_public_ips] [--no_use_public_ips]
   08:06:33                             [--min_cpu_platform MIN_CPU_PLATFORM]
   08:06:33                             [--dataflow_worker_jar DATAFLOW_WORKER_JAR]
   08:06:33                             [--dataflow_job_file DATAFLOW_JOB_FILE]
   08:06:33                             [--experiment EXPERIMENTS]
   08:06:33                             [--number_of_worker_harness_threads NUMBER_OF_WORKER_HARNESS_THREADS]
   08:06:33                             [--profile_cpu] [--profile_memory]
   08:06:33                             [--profile_location PROFILE_LOCATION]
   08:06:33                             [--profile_sample_rate PROFILE_SAMPLE_RATE]
   08:06:33                             [--requirements_file REQUIREMENTS_FILE]
   08:06:33                             [--requirements_cache REQUIREMENTS_CACHE]
   08:06:33                             [--requirements_cache_only_sources]
   08:06:33                             [--setup_file SETUP_FILE]
   08:06:33                             [--beam_plugin BEAM_PLUGINS]
   08:06:33                             [--pickle_library {cloudpickle,default,dill}]
   08:06:33                             [--save_main_session]
   08:06:33                             [--sdk_location SDK_LOCATION]
   08:06:33                             [--extra_package EXTRA_PACKAGES]
   08:06:33                             [--prebuild_sdk_container_engine PREBUILD_SDK_CONTAINER_ENGINE]
   08:06:33                             [--prebuild_sdk_container_base_image PREBUILD_SDK_CONTAINER_BASE_IMAGE]
   08:06:33                             [--cloud_build_machine_type CLOUD_BUILD_MACHINE_TYPE]
   08:06:33                             [--docker_registry_push_url DOCKER_REGISTRY_PUSH_URL]
   08:06:33                             [--job_endpoint JOB_ENDPOINT]
   08:06:33                             [--artifact_endpoint ARTIFACT_ENDPOINT]
   08:06:33                             [--job_server_timeout JOB_SERVER_TIMEOUT]
   08:06:33                             [--environment_type ENVIRONMENT_TYPE]
   08:06:33                             [--environment_config ENVIRONMENT_CONFIG]
   08:06:33                             [--environment_option ENVIRONMENT_OPTIONS]
   08:06:33                             [--sdk_worker_parallelism SDK_WORKER_PARALLELISM]
   08:06:33                             [--environment_cache_millis ENVIRONMENT_CACHE_MILLIS]
   08:06:33                             [--output_executable_path OUTPUT_EXECUTABLE_PATH]
   08:06:33                             [--artifacts_dir ARTIFACTS_DIR]
   08:06:33                             [--job_port JOB_PORT]
   08:06:33                             [--artifact_port ARTIFACT_PORT]
   08:06:33                             [--expansion_port EXPANSION_PORT]
   08:06:33                             [--job_server_java_launcher JOB_SERVER_JAVA_LAUNCHER]
   08:06:33                             [--job_server_jvm_properties JOB_SERVER_JVM_PROPERTIES]
   08:06:33                             [--flink_master FLINK_MASTER]
   08:06:33                             [--flink_version {1.12,1.13,1.14,1.15}]
   08:06:33                             [--flink_job_server_jar FLINK_JOB_SERVER_JAR]
   08:06:33                             [--flink_submit_uber_jar]
   08:06:33                             [--spark_master_url SPARK_MASTER_URL]
   08:06:33                             [--spark_job_server_jar SPARK_JOB_SERVER_JAR]
   08:06:33                             [--spark_submit_uber_jar]
   08:06:33                             [--spark_rest_url SPARK_REST_URL]
   08:06:33                             [--spark_version {2,3}]
   08:06:33                             [--on_success_matcher ON_SUCCESS_MATCHER]
   08:06:33                             [--dry_run DRY_RUN]
   08:06:33                             [--wait_until_finish_duration WAIT_UNTIL_FINISH_DURATION]
   08:06:33                             [--pubsub_root_url PUBSUBROOTURL]
   08:06:33                             [--s3_access_key_id S3_ACCESS_KEY_ID]
   08:06:33                             [--s3_secret_access_key S3_SECRET_ACCESS_KEY]
   08:06:33                             [--s3_session_token S3_SESSION_TOKEN]
   08:06:33                             [--s3_endpoint_url S3_ENDPOINT_URL]
   08:06:33                             [--s3_region_name S3_REGION_NAME]
   08:06:33                             [--s3_api_version S3_API_VERSION]
   08:06:33                             [--s3_verify S3_VERIFY] [--s3_disable_ssl]
   08:06:33                             [--publish_to_big_query PUBLISH_TO_BIG_QUERY]
   08:06:33                             [--metrics_dataset METRICS_DATASET]
   08:06:33                             [--metrics_table METRICS_TABLE]
   08:06:33                             [--influx_measurement INFLUX_MEASUREMENT]
   08:06:33                             [--influx_db_name INFLUX_DB_NAME]
   08:06:33                             [--influx_hostname INFLUX_HOSTNAME]
   08:06:33                             [--input_options INPUT_OPTIONS]
   08:06:33                             [--timeout_ms TIMEOUT_MS]
   08:06:33                             [--iterations ITERATIONS] [--fanout=1 FANOUT=1]
   08:06:33 group_by_key_test.py: error: argument --fanout=1: expected one argument
   ```
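   
   Note the tail of the usage text: the parser has registered an option literally named `--fanout=1` (shown as `[--fanout=1 FANOUT=1]`), which suggests the load test's pipeline arguments are no longer being split on `=` before they reach argparse. A minimal, hypothetical sketch (not the actual load-test code) that reproduces the same argparse failure:
   
   ```python
   import argparse
   
   # Hypothetical reproduction: if the option name is registered with the
   # "=1" still attached (e.g. because "--fanout=1" was never split on "="),
   # argparse treats the whole string as the flag name.
   parser = argparse.ArgumentParser(prog='group_by_key_test.py')
   parser.add_argument('--iterations')
   parser.add_argument('--fanout=1')  # note: option name includes "=1"
   
   # The command-line token then matches that flag name exactly, leaving no
   # value for it, so argparse exits with:
   #   group_by_key_test.py: error: argument --fanout=1: expected one argument
   parser.parse_args(['--iterations', '1', '--fanout=1'])
   ```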
   
   
   ### Issue Priority
   
   Priority: 1
   
   ### Issue Component
   
   Component: testing

