knutole opened a new issue #17681: URL: https://github.com/apache/airflow/issues/17681
**Apache Airflow version**:

```
Apache Airflow version | 2.1.2
executor               | CeleryExecutor
task_logging_handler   | airflow.utils.log.file_task_handler.FileTaskHandler
sql_alchemy_conn       | postgresql+psycopg2://edidev:[email protected]:5432/airflow_backend
dags_folder            | /store/dags
plugins_folder         | /home/airflow/airflow/plugins
base_log_folder        | /store/logs
remote_base_log_folder |
```

**OS**:

```
OS              | Linux
architecture    | x86_64
uname           | uname_result(system='Linux', node='7e21ed2c9829', release='5.4.0-1049-gcp', version='#53~18.04.1-Ubuntu SMP Thu Jul 15 11:32:10 UTC 2021', machine='x86_64', processor='x86_64')
locale          | (None, None)
python_version  | 3.6.9 (default, Jan 26 2021, 15:33:00) [GCC 8.4.0]
python_location | /usr/bin/python3
```

**Apache Airflow Provider versions**:

```
apache-airflow-providers-celery   | 2.0.0
apache-airflow-providers-ftp      | 2.0.0
apache-airflow-providers-imap     | 2.0.0
apache-airflow-providers-postgres | 2.0.0
apache-airflow-providers-redis    | 2.0.0
apache-airflow-providers-sqlite   | 2.0.0
```

**Deployment**: Docker Compose

**What happened**:

We are getting `No SecretsMasker found!` when trying to run DAGs. Could this be due to breaking changes in the configuration file? We have also tried setting `hide_sensitive_var_conn_fields = False`, to no avail.

```bash
[2021-08-18 11:05:53,690] {celery_executor.py:120} ERROR - Failed to execute task No SecretsMasker found!.
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/airflow/executors/celery_executor.py", line 117, in _execute_in_fork
    args.func(args)
  File "/usr/local/lib/python3.6/dist-packages/airflow/cli/cli_parser.py", line 48, in command
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/airflow/utils/cli.py", line 91, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/airflow/cli/commands/task_command.py", line 212, in task_run
    settings.configure_orm(disable_connection_pool=True)
  File "/usr/local/lib/python3.6/dist-packages/airflow/settings.py", line 224, in configure_orm
    mask_secret(engine.url.password)
  File "/usr/local/lib/python3.6/dist-packages/airflow/utils/log/secrets_masker.py", line 91, in mask_secret
    _secrets_masker().add_mask(secret, name)
  File "/usr/local/lib/python3.6/dist-packages/airflow/utils/log/secrets_masker.py", line 105, in _secrets_masker
    raise RuntimeError("No SecretsMasker found!")
RuntimeError: No SecretsMasker found!

[2021-08-18 11:05:53,710: ERROR/ForkPoolWorker-3] Task airflow.executors.celery_executor.execute_command[f6a9b0cd-bb0c-414a-a51c-80579f2d2f1e] raised unexpected: AirflowException('Celery command failed on host: 64c3bc97f173',)
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/celery/app/trace.py", line 412, in trace_task
    R = retval = fun(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/celery/app/trace.py", line 704, in __protected_call__
    return self.run(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/airflow/executors/celery_executor.py", line 88, in execute_command
    _execute_in_fork(command_to_exec)
  File "/usr/local/lib/python3.6/dist-packages/airflow/executors/celery_executor.py", line 99, in _execute_in_fork
    raise AirflowException('Celery command failed on host: ' + get_hostname())
airflow.exceptions.AirflowException: Celery command failed on host: 64c3bc97f173
```

The failing line is line 105 of `secrets_masker.py`, in `_secrets_masker()`:

```python
@cache
def _secrets_masker() -> "SecretsMasker":
    for flt in logging.getLogger('airflow.task').filters:
        if isinstance(flt, SecretsMasker):
            return flt
    raise RuntimeError("No SecretsMasker found!")
```
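As far as we can tell from that snippet, `_secrets_masker()` only looks for a `SecretsMasker` filter on the `airflow.task` logger. A minimal diagnostic sketch we can run on a worker to confirm whether the filter is actually attached (the only assumption is the import path, which is taken from the traceback above):

```python
import logging

from airflow.utils.log.secrets_masker import SecretsMasker

# _secrets_masker() scans the filters attached to the 'airflow.task'
# logger, so list them and check for a SecretsMasker instance.
task_logger = logging.getLogger("airflow.task")
print("filters:", [type(f).__name__ for f in task_logger.filters])
print("masker present:", any(isinstance(f, SecretsMasker) for f in task_logger.filters))
```

If this prints `False` on a worker, the filter really is missing from the logger that `_secrets_masker()` inspects.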
This is our logging setup in our DAG:

```python
log = logging.getLogger()
log.setLevel(logging.DEBUG)

stream_handler = logging.StreamHandler()
log.addHandler(stream_handler)
```
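Since `logging.getLogger()` with no arguments returns the root logger, this setup attaches a handler at the root of the logging tree. One thing we can try on our side (a sketch of a possible workaround, not a confirmed fix) is to scope the logger to the DAG module instead:

```python
import logging

# Use a module-scoped logger rather than mutating the root logger, so
# Airflow's own loggers (and any filters attached to 'airflow.task',
# including the SecretsMasker) are left untouched.
log = logging.getLogger(__name__)
log.setLevel(logging.DEBUG)

stream_handler = logging.StreamHandler()
log.addHandler(stream_handler)
```

The key difference from our current setup is that nothing on the root logger (or on `airflow.task`) is modified.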
**What you expected to happen**:

We expect log files to be written and DAGs to run.

**How to reproduce it**:

See above; this happens every time, for every DAG.

**Anything else we need to know**:

We have tried resetting, deleting DAGs, and checking permissions on the log folders, among other things. We have also confirmed that the log path is in fact available to the Docker containers (mounted file stores; this setup has worked perfectly for 2+ years).

**Are you willing to submit a PR?**

If it can help, sure.