kakarukeys opened a new issue #15415:
URL: https://github.com/apache/airflow/issues/15415


   **Apache Airflow version**: 2.0.1
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**: on my laptop
   - **OS** (e.g. from /etc/os-release): macOS Mojave 10.14.6
   - **Kernel** (e.g. `uname -a`): Darwin Wongs-MBP 18.7.0 Darwin Kernel Version 18.7.0: Tue Jan 12 22:04:47 PST 2021; root:xnu-4903.278.56~1/RELEASE_X86_64 x86_64
   
   **What happened**:
   
   Configured remote logging to an S3 bucket, but only the logs of DAG runs appeared in the bucket.
   Logs of the Airflow server components (scheduler, web server, etc.) did not appear there.
   
   **What you expected to happen**:
   
   All logs, including those of the scheduler and web server, go to the S3 bucket.
   
   **How to reproduce it**:
   
   1. Follow the quick start guide at https://airflow.apache.org/docs/apache-airflow/stable/start/local.html
   
   2. Before starting the web server, set the following environment variables:
   
   ```sh
   export AIRFLOW__LOGGING__REMOTE_LOGGING=True
   export AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER=s3://my-bucket/
   export AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID=my_remote_logging_conn_id
   ```
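
   As a sanity check (assuming the Airflow 2.0 CLI, which this version ships with), the effective values can be read back before starting anything:

   ```sh
   # These should echo the values exported above.
   airflow config get-value logging remote_logging
   airflow config get-value logging remote_base_log_folder
   airflow config get-value logging remote_log_conn_id
   ```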
   
   3. Start the web server and configure your S3 connection in the web UI's "Connections" section.
   
   ```
   Conn Id * my_remote_logging_conn_id
   Conn Type  S3
   Extra {"region_name": "nyc3",
    "host": "https://nyc3.digitaloceanspaces.com";,
    "aws_access_key_id": "xxx",
    "aws_secret_access_key": "xxx"}
   ```
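
   As an alternative to the web UI, the same connection can be created from the CLI. A sketch using the Airflow 2.0 `airflow connections add` command (the lowercase `s3` conn type and the extras mirror the form above):

   ```sh
   airflow connections add my_remote_logging_conn_id \
       --conn-type s3 \
       --conn-extra '{"region_name": "nyc3",
                      "host": "https://nyc3.digitaloceanspaces.com",
                      "aws_access_key_id": "xxx",
                      "aws_secret_access_key": "xxx"}'
   ```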
   
   4. Restart the web server
   5. Start the scheduler in another console window (with the same environment variables set)
   6. Execute a DAG
   7. Head to your S3 bucket's UI; you will see that only the logs of DAG runs appear (one way to verify this from the command line is sketched below).
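
   For reference, a sketch of how to inspect what actually landed in the bucket, using the AWS CLI pointed at the DigitalOcean Spaces endpoint from the connection above:

   ```sh
   # Lists every key under the bucket; only task-log paths such as
   # <dag_id>/<task_id>/<execution_date>/1.log show up -- no scheduler
   # or web server logs.
   aws s3 ls s3://my-bucket/ --recursive \
       --endpoint-url https://nyc3.digitaloceanspaces.com
   ```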
   

