Hey Larry, are you using the `KubernetesExecutor`? We support ES logging
for our clients and work with Local, Celery and Kubernetes executors. I
took a look through our helm charts to see if anything jumped out.
I'm wondering if you may need to pass this extra configuration to the
executor pods:
https://github.com/astronomer/helm.astronomer.io/blob/master/charts/airflow/templates/configmap.yaml#L78
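For reference, a rough sketch of the kind of settings that configmap pushes into the executor pod environment. This is hypothetical (the host value is a placeholder, and the exact keys depend on the chart version), but the gist is that the executor pods need the same Elasticsearch settings as the scheduler:

```yaml
# Hypothetical sketch -- exact keys depend on your chart version.
# The executor pods need the same Elasticsearch settings as the
# scheduler, in particular a host, or the ES task handler never
# gets wired up in the worker.
AIRFLOW__CORE__REMOTE_LOGGING: "True"
AIRFLOW__ELASTICSEARCH__HOST: "elasticsearch.example.com:9200"  # placeholder
AIRFLOW__ELASTICSEARCH__WRITE_STDOUT: "True"
AIRFLOW__ELASTICSEARCH__JSON_FORMAT: "True"
```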

It's possible that without that configuration set, Airflow skips the
logger configuration here:
https://github.com/apache/airflow/blob/d5fa17f7b969eab6fd2af731bc63e5e6e90d56cb/airflow/config_templates/airflow_local_settings.py#L200
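If I'm reading that file right, the logic is roughly this (paraphrased pseudocode, not the actual source):

```python
# Paraphrased sketch of airflow_local_settings.py, not the real code.
# Names mirror the airflow.cfg options.
if REMOTE_LOGGING:
    if REMOTE_BASE_LOG_FOLDER.startswith('s3://'):
        ...  # attach the S3 task handler
    elif REMOTE_BASE_LOG_FOLDER.startswith('gs://'):
        ...  # attach the GCS task handler
    elif ELASTICSEARCH_HOST:
        ...  # attach the ES task handler; this is the branch that
        ...  # honors write_stdout / json_format
# If elasticsearch host is empty, none of the branches fire and the
# workers fall back to plain file logging, which would explain what
# you're seeing.
```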

On Thu, Jan 9, 2020 at 2:25 AM Larry Zhu <larry....@oracle.com> wrote:

> I am using 1.10.6 and here are my log configurations for running Airflow
> on Kubernetes. I set up Kubernetes to send all the console output logs to
> Elasticsearch and I am trying to configure the Airflow worker to write
> logs to the console, but it does not seem to work. I can see the local
> logs in the pod, but the task instance logs are not getting written to
> the console, so my Filebeat daemon set cannot pick up the logs. Can you
> please help shed light on this?
>
>
> airflow:
>   config:
>     AIRFLOW__CORE__REMOTE_LOGGING: "True"
>     # HTTP_PROXY: "http://proxy.mycompany.com:123"
>     AIRFLOW__ELASTICSEARCH__LOG_ID_TEMPLATE: "{{dag_id}}-{{task_id}}-{{execution_date}}-{{try_number}}"
>     AIRFLOW__ELASTICSEARCH__END_OF_LOG_MARK: "end_of_log"
>     AIRFLOW__ELASTICSEARCH__WRITE_STDOUT: "True"
>     AIRFLOW__ELASTICSEARCH__JSON_FORMAT: "True"
>     AIRFLOW__ELASTICSEARCH__JSON_FIELDS: "asctime, filename, lineno, levelname, message"
>

-- 
*Greg Neiheisel* / CTO Astronomer.io
