ConverJens commented on pull request #13723:
URL: https://github.com/apache/beam/pull/13723#issuecomment-766724417


   @dandy10 
   I added an additional printout of `options.get_all_options()`, and all of the s3 args are indeed empty:
   
   ```
INFO:apache_beam.runners.portability.local_job_service:Worker: severity: INFO
timestamp {   seconds: 1611570854   nanos: 805634498 } message:
"{\'runner\': None, \'streaming\': False, \'beam_services\': {},
\'type_check_strictness\': \'DEFAULT_TO_ANY\', \'type_check_additional\': \'\',
\'pipeline_type_check\': True, \'runtime_type_check\': False,
\'performance_runtime_type_check\': False,
\'direct_runner_use_stacked_bundle\': True, \'direct_runner_bundle_repeat\': 0,
\'direct_num_workers\': 1, \'direct_running_mode\': \'in_memory\',
\'dataflow_endpoint\': \'https://dataflow.googleapis.com\',
\'project\': None, \'job_name\': None, \'staging_location\': None,
\'temp_location\': None, \'region\': None, \'service_account_email\': None,
\'no_auth\': False, \'template_location\': None, \'labels\': None,
\'update\': False, \'transform_name_mapping\': None,
\'enable_streaming_engine\': False, \'dataflow_kms_key\': None,
\'flexrs_goal\': None, \'hdfs_host\': None, \'hdfs_port\': None,
\'hdfs_user\': None, \'hdfs_full_urls\': False, \'num_workers\': None,
\'max_num_workers\': None, \'autoscaling_algorithm\': None,
\'machine_type\': None, \'disk_size_gb\': None, \'disk_type\': None,
\'worker_region\': None, \'worker_zone\': None, \'zone\': None,
\'network\': None, \'subnetwork\': None,
\'worker_harness_container_image\': None,
\'sdk_harness_container_image_overrides\': None, \'use_public_ips\': None,
\'min_cpu_platform\': None, \'dataflow_worker_jar\': None,
\'dataflow_job_file\': None, \'experiments\': None,
\'number_of_worker_harness_threads\': None, \'profile_cpu\': False,
\'profile_memory\': False, \'profile_location\': None,
\'profile_sample_rate\': 1.0, \'requirements_file\': None,
\'requirements_cache\': None, \'setup_file\': None, \'beam_plugins\': None,
\'save_main_session\': False, \'sdk_location\': \'default\',
\'extra_packages\': None, \'prebuild_sdk_container_engine\': None,
\'prebuild_sdk_container_base_image\': None,
\'docker_registry_push_url\': None, \'job_endpoint\': None,
\'artifact_endpoint\': None, \'job_server_timeout\': 60,
\'environment_type\': None, \'environment_config\': None,
\'environment_options\': None, \'sdk_worker_parallelism\': 1,
\'environment_cache_millis\': 0, \'output_executable_path\': None,
\'artifacts_dir\': None, \'job_port\': 0, \'artifact_port\': 0,
\'expansion_port\': 0, \'job_server_java_launcher\': \'java\',
\'job_server_jvm_properties\': [], \'flink_master\': \'[auto]\',
\'flink_version\': \'1.10\', \'flink_job_server_jar\': None,
\'flink_submit_uber_jar\': False, \'spark_master_url\': \'local[4]\',
\'spark_job_server_jar\': None, \'spark_submit_uber_jar\': False,
\'spark_rest_url\': None, \'on_success_matcher\': None, \'dry_run\': False,
\'wait_until_finish_duration\': None, \'pubsubRootUrl\': None,
\'s3_access_key_id\': None, \'s3_secret_access_key\': None,
\'s3_session_token\': None, \'s3_endpoint_url\': None,
\'s3_region_name\': None, \'s3_api_version\': None, \'s3_verify\': None,
\'s3_disable_ssl\': False}" instruction_id: "bundle_31" transform_id:
"WriteSplit[eval]/Write/Write/WriteImpl/WriteBundles" log_location:
"/usr/local/lib/python3.7/dist-packages/apache_beam/io/aws/s3io.py:70"
thread: "Thread-14"
   ```
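
   For reference, a minimal sketch of the kind of printout described above, assuming a Beam version with `S3Options` registered (the `--s3_endpoint_url` value here is a hypothetical placeholder, not one from the pipeline above):

   ```python
   from apache_beam.options.pipeline_options import PipelineOptions

   # With no --s3_* flags passed, every s3 option comes back empty
   # (None or False), matching the worker log above.
   empty_opts = PipelineOptions([]).get_all_options()
   print({k: v for k, v in empty_opts.items() if k.startswith('s3_')})

   # Passing an s3 flag on the command line should show up here;
   # the question is whether the worker ever sees such flags.
   set_opts = PipelineOptions(
       ['--s3_endpoint_url=http://minio.example:9000']  # hypothetical value
   ).get_all_options()
   print(set_opts['s3_endpoint_url'])
   ```

   If the second printout shows the value locally but the worker-side log still shows `None`, that would point at the options not being forwarded to the worker.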

