athenawisdoms opened a new issue #8905:
URL: https://github.com/apache/airflow/issues/8905


   
   I am using Apache Airflow 1.10.10 to run several Python scripts in a DAG via 
the `BashOperator`. The logs are currently written to `/usr/airflow/logs`.
   
   Is it possible to configure Airflow so that:
   
   1. the logs are *also* written to another directory, such as `/home/airflow/logs`;
   2. the logs contain only the stdout from the Python scripts;
   3. the logs are stored in the following directory/filename format:
   
           /home/airflow/logs/[execution-date]-[dag-id]-[task-id].log
   4. retries are appended to the same `.log` file, if possible? Otherwise, the naming convention could be:
   
           /home/airflow/logs/[execution-date]-[dag-id]-[task-id]-[retry-number].log
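
   For context, the closest built-in knob I've found so far is `log_filename_template` (together with `base_log_folder`) in `airflow.cfg`, but as far as I can tell it relocates the task logs rather than writing a second copy, and the files still contain Airflow's own log lines, not just the script's stdout. Something along these lines (note `{{ ts }}` is the full execution timestamp, not just the date):
   
   ```ini
   # airflow.cfg -- [core] section (Airflow 1.10.x)
   [core]
   base_log_folder = /home/airflow/logs
   # Jinja-templated per-task log path; the default template includes
   # try_number, so each retry normally gets its own file.
   log_filename_template = {{ ts }}-{{ ti.dag_id }}-{{ ti.task_id }}.log
   ```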
   
   Thanks everyone!
   
   **Example DAG**
   
   ```python
   from airflow import DAG
   from airflow.operators.bash_operator import BashOperator
   
   default_args = { ... }
   
   dag = DAG(
       'mydag',
       default_args=default_args,
       schedule_interval='*/10 * * * *',
   )
   
   # Log to /home/airflow/logs/2020-05-12-mydag-hello_world.log
   t1 = BashOperator(
       task_id='hello_world',
       bash_command='/path/to/env/bin/python /path/to/scripts/hello_world.py',
       dag=dag,
   )
   
   
   # Log to /home/airflow/logs/2020-05-12-mydag-hey_there.log
   t2 = BashOperator(
       task_id='hey_there',
       bash_command='/path/to/env/bin/python /path/to/scripts/hey_there.py',
       dag=dag,
   )
   
   t1 >> t2
   ```
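
   One workaround I'm considering for points 1–3 is redirecting each script's stdout inside `bash_command`, since the `BashOperator` renders that field with Jinja (so variables like `{{ ds }}`, `{{ dag.dag_id }}`, and `{{ task.task_id }}` should be available). A plain-Python sketch of the naming convention itself, using a hypothetical `log_path` helper (not an Airflow API):
   
   ```python
   def log_path(base_dir, execution_date, dag_id, task_id, try_number=None):
       """Build the requested log filename.
   
       try_number is appended only when given, i.e. only when retries
       cannot be appended to the same file.
       """
       name = f"{execution_date}-{dag_id}-{task_id}"
       if try_number is not None:
           name = f"{name}-{try_number}"
       return f"{base_dir}/{name}.log"
   
   print(log_path("/home/airflow/logs", "2020-05-12", "mydag", "hello_world"))
   ```
   
   In the DAG above the redirect would look something like `bash_command='/path/to/env/bin/python /path/to/scripts/hello_world.py >> /home/airflow/logs/{{ ds }}-{{ dag.dag_id }}-{{ task.task_id }}.log'`, though I haven't verified how this interacts with retries.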


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

