marcusianlevine opened a new issue #11479:
URL: https://github.com/apache/airflow/issues/11479


   **Apache Airflow version**: 1.10.12
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl 
version`): 1.16.9
   
   **Environment**: 
   
   - **Cloud provider or hardware configuration**: AWS
   - **OS** (e.g. from /etc/os-release): 
   - **Kernel** (e.g. `uname -a`): 
   - **Install tools**: Docker image running in k8s Pods
   - **Others**: Rancher-provisioned k8s clusters
   
   **What happened**:
   
   Configuring the latest version of the Elasticsearch backport provider 
as my log handler via `config/airflow_local_settings.py` resulted in an error 
on the webserver when it tried to read logs from Elasticsearch:
   
   ```
   [2020-10-12 21:02:00,487] {app.py:1892} ERROR - Exception on 
/get_logs_with_metadata [GET]
   Traceback (most recent call last):
     File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2447, in 
wsgi_app
       response = self.full_dispatch_request()
     File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1952, in 
full_dispatch_request
       rv = self.handle_user_exception(e)
     File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1821, in 
handle_user_exception
       reraise(exc_type, exc_value, tb)
     File "/usr/local/lib/python3.7/site-packages/flask/_compat.py", line 39, 
in reraise
       raise value
     File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1950, in 
full_dispatch_request
       rv = self.dispatch_request()
     File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1936, in 
dispatch_request
       return self.view_functions[rule.endpoint](**req.view_args)
     File 
"/usr/local/lib/python3.7/site-packages/airflow/www_rbac/decorators.py", line 
121, in wrapper
       return f(self, *args, **kwargs)
     File 
"/usr/local/lib/python3.7/site-packages/flask_appbuilder/security/decorators.py",
 line 109, in wraps
       return f(self, *args, **kwargs)
     File 
"/usr/local/lib/python3.7/site-packages/airflow/www_rbac/decorators.py", line 
56, in wrapper
       return f(*args, **kwargs)
     File "/usr/local/lib/python3.7/site-packages/airflow/utils/db.py", line 
74, in wrapper
       return func(*args, **kwargs)
     File "/usr/local/lib/python3.7/site-packages/airflow/www_rbac/views.py", 
line 733, in get_logs_with_metadata
       logs, metadata = _get_logs_with_metadata(try_number, metadata)
     File "/usr/local/lib/python3.7/site-packages/airflow/www_rbac/views.py", 
line 724, in _get_logs_with_metadata
       logs, metadatas = handler.read(ti, try_number, metadata=metadata)
     File 
"/usr/local/lib/python3.7/site-packages/airflow/utils/log/file_task_handler.py",
 line 194, in read
       logs[i] += log
   TypeError: can only concatenate str (not "list") to str
   ```
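
   For context, the failing line in `file_task_handler.py` accumulates each chunk of log output onto a string, one entry per try number. If a handler's `read()` hands back a list for each try (which the Airflow 2.0-style Elasticsearch handler appears to do) instead of a plain string, that concatenation fails. A minimal standalone sketch of the mismatch (the handler return shapes here are my assumptions, not Airflow's actual code):

   ```python
   # Illustrative sketch of the type mismatch in the traceback above;
   # the handler return shapes are assumptions, not Airflow's actual code.
   logs = ['']  # FileTaskHandler.read keeps one string per try_number

   # A 1.10-style handler yields a plain string per try:
   logs[0] += "task log text\n"  # fine

   # The backport Elasticsearch handler appears to yield a *list* per try
   # (the Airflow 2.0 read() contract), so the same += blows up:
   new_style_chunk = [("worker-hostname", "task log text\n")]
   try:
       logs[0] += new_style_chunk
   except TypeError as exc:
       print(exc)  # can only concatenate str (not "list") to str
   ```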
   
   Here is the relevant section of my customized `airflow_local_settings.py` 
file with the updated Elasticsearch handler from the backport provider:
   ```python
   ...
       elif ELASTICSEARCH_HOST:
           ELASTICSEARCH_LOG_ID_TEMPLATE: str = conf.get('elasticsearch', 
'LOG_ID_TEMPLATE')
           ELASTICSEARCH_END_OF_LOG_MARK: str = conf.get('elasticsearch', 
'END_OF_LOG_MARK')
           ELASTICSEARCH_FRONTEND: str = conf.get('elasticsearch', 'frontend')
           ELASTICSEARCH_WRITE_STDOUT: bool = conf.getboolean('elasticsearch', 
'WRITE_STDOUT')
           ELASTICSEARCH_JSON_FORMAT: bool = conf.getboolean('elasticsearch', 
'JSON_FORMAT')
           ELASTICSEARCH_JSON_FIELDS: str = conf.get('elasticsearch', 
'JSON_FIELDS')
   
           ELASTIC_REMOTE_HANDLERS: Dict[str, Dict[str, Union[str, bool]]] = {
               'task': {
                   'class': 
'airflow.providers.elasticsearch.log.es_task_handler.ElasticsearchTaskHandler',
                   'formatter': 'airflow',
                   'base_log_folder': str(os.path.expanduser(BASE_LOG_FOLDER)),
                   'log_id_template': ELASTICSEARCH_LOG_ID_TEMPLATE,
                   'filename_template': FILENAME_TEMPLATE,
                   'end_of_log_mark': ELASTICSEARCH_END_OF_LOG_MARK,
                   'host': ELASTICSEARCH_HOST,
                   'frontend': ELASTICSEARCH_FRONTEND,
                   'write_stdout': ELASTICSEARCH_WRITE_STDOUT,
                   'json_format': ELASTICSEARCH_JSON_FORMAT,
                   'json_fields': ELASTICSEARCH_JSON_FIELDS
               },
           }
   
           LOGGING_CONFIG['handlers'].update(ELASTIC_REMOTE_HANDLERS)
   ...
   ```
   
   **What you expected to happen**:
   
   Airflow's web UI properly displays the logs from Elasticsearch
   
   **How to reproduce it**:
   Configure custom task logging via `config/airflow_local_settings.py` to use 
`airflow.providers.elasticsearch.log.es_task_handler.ElasticsearchTaskHandler`, 
and set `logging_config_class` in `airflow.cfg` to point at that config.
   
   Once a task has run, try to view its logs in the web UI and check the 
webserver logs for the error above.
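
   For completeness, the `airflow.cfg` side of the reproduction is roughly the following config fragment (the module and attribute names are assumptions based on my `config/airflow_local_settings.py` layout; adjust them to wherever your settings module lives on `PYTHONPATH`):

   ```ini
   ; Hypothetical airflow.cfg fragment; module/attribute names assumed.
   [core]
   logging_config_class = airflow_local_settings.LOGGING_CONFIG

   [elasticsearch]
   host = <your-elasticsearch-host>:9200
   write_stdout = True
   json_format = True
   ```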
   

