marclamberti commented on issue #8212:
URL: https://github.com/apache/airflow/issues/8212#issuecomment-629870279


   @TRReeve I didn’t need to do that. Actually, if you want remote
logging to work with the Kubernetes executor, you have to define additional
Kubernetes environment variables so that your worker pods stay in sync with the
scheduler/webserver. E.g.:
   AIRFLOW__CORE__REMOTE_LOGGING=True
   AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER=s3://my-bucket/my-key
   AIRFLOW__CORE__REMOTE_LOG_CONN_ID=myawsconn
   AIRFLOW__CORE__FERNET_KEY=myfernetkey
   
   
   AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__REMOTE_LOGGING=True
   AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER=s3://my-bucket/my-key
   AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__REMOTE_LOG_CONN_ID=myawsconn
   AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__FERNET_KEY=myfernetkey
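   To make the pairing explicit, here is a minimal sketch (values are placeholders, not a definitive setup) of how each scheduler-side setting is mirrored into a worker-pod variable by prefixing it with AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__:

```python
import os

# Placeholder values for illustration only.
scheduler_side = {
    "AIRFLOW__CORE__REMOTE_LOGGING": "True",
    "AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER": "s3://my-bucket/my-key",
    "AIRFLOW__CORE__REMOTE_LOG_CONN_ID": "myawsconn",
}

# Mirror each setting so the Kubernetes executor injects it into worker pods.
prefix = "AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__"
for key, value in scheduler_side.items():
    os.environ[key] = value
    os.environ[prefix + key] = value

print(os.environ[prefix + "AIRFLOW__CORE__REMOTE_LOGGING"])
```

   The same pairs can of course be set directly in your deployment manifest or docker-compose file instead of Python.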
   
   Then create the connection myawsconn with type S3 and the following keys in its Extra field:
   - aws_access_key_id
   - aws_secret_access_key
   - region_name
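   For reference, the Extra field is a JSON object; a minimal sketch (the key, secret, and region values below are placeholders, not real credentials) looks like this:

```python
import json

# Placeholder credentials for illustration only -- never commit real keys.
extra = json.dumps({
    "aws_access_key_id": "AKIAXXXXXXXXXXXXXXXX",
    "aws_secret_access_key": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
    "region_name": "eu-west-1",
})
print(extra)
```

   Paste that JSON into the Extra box of the myawsconn connection in the Airflow UI.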
   
   Also, I didn’t need to change the task handler, as it is switched
automatically when remote logging is set to True, and I didn’t have to define a
custom logging class.
   
   Hope it helps :)


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]