sid-habu commented on issue #28365:
URL: https://github.com/apache/airflow/issues/28365#issuecomment-1352402832

   > > Just to confirm:
   > > 
   > > 1. Do you provide the AWS Access Key ID, AWS Secret Access Key, and `role_arn` in the connection extra? Since you did not mask the access key and secret key as `xxxxx`, I assume you might be retrieving the initial credentials from an IAM instance profile or somewhere else.
   > > 2. Are the log files created in S3?
   > > 3. Does this issue persist for newly created logs, after you upgraded to 6.1.0?
   > > 4. Did you also upgrade the Amazon provider on the webserver? I ask because logs are retrieved by the webserver, not the scheduler.
   > > 5. Could you check your credentials with this snippet: https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/connections/aws.html#snippet-to-create-connection-and-convert-to-uri and verify that the correct role is printed?
   > 
   > 1. The Access Key and Secret are provided in the Connection UI
   > 2. Yes, I see the log files getting generated in S3
   > 3. Yes
   > 4. This might be the issue! You are right: `pip freeze` on the webserver still points to `6.0.0`
   > 
   > ```
   > apache-airflow-providers-amazon==6.0.0
   > apache-airflow-providers-celery==3.0.0
   > apache-airflow-providers-cncf-kubernetes==4.4.0
   > apache-airflow-providers-common-sql==1.2.0
   > apache-airflow-providers-docker==3.2.0
   > apache-airflow-providers-elasticsearch==4.2.1
   > apache-airflow-providers-ftp==3.1.0
   > apache-airflow-providers-google==8.4.0
   > apache-airflow-providers-grpc==3.0.0
   > apache-airflow-providers-hashicorp==3.1.0
   > apache-airflow-providers-http==4.0.0
   > apache-airflow-providers-imap==3.0.0
   > apache-airflow-providers-microsoft-azure==4.3.0
   > apache-airflow-providers-mysql==3.2.1
   > apache-airflow-providers-odbc==3.1.2
   > apache-airflow-providers-postgres==5.2.2
   > apache-airflow-providers-redis==3.0.0
   > apache-airflow-providers-sendgrid==3.0.0
   > apache-airflow-providers-sftp==4.1.0
   > apache-airflow-providers-slack==6.0.0
   > apache-airflow-providers-snowflake==3.3.0
   > apache-airflow-providers-sqlite==3.2.1
   > apache-airflow-providers-ssh==3.2.0
   > ```
   > 
   > 5. Yes, it's correct
   > 
   > Let me work on point 4 above. Thank you so much for the insights.
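   
   The mismatch in point 4 is easy to check programmatically. A minimal stdlib sketch (the helper name `provider_version` is illustrative, not an Airflow API) that can be run on each component to compare installed versions:
   
   ```python
   # Report the installed amazon provider version so it can be compared
   # across Airflow components (webserver, scheduler, workers).
   from importlib.metadata import version, PackageNotFoundError
   
   def provider_version(dist: str = "apache-airflow-providers-amazon") -> str | None:
       """Return the installed version of `dist`, or None if it is absent."""
       try:
           return version(dist)
       except PackageNotFoundError:
           return None
   
   if __name__ == "__main__":
       # Run this on both the webserver and the scheduler and compare.
       print(provider_version())
   ```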
   
   Confirmed, it works after upgrading to `apache-airflow-providers-amazon==6.1.0` on both the webserver and the scheduler. It might be a good idea to update `https://raw.githubusercontent.com/apache/airflow/constraints-2.3.4/constraints-3.7.txt` with `apache-airflow-providers-amazon==6.1.0`, since `6.0.0` is broken for S3 remote logging with an assumed role.
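   
   As a quick way to audit a constraints file for the broken pin, a small stdlib sketch could be used (the helper name `pinned_version` is hypothetical; constraints files are plain `name==version` lines):
   
   ```python
   # Find the version pinned for a distribution in an Airflow constraints file.
   def pinned_version(constraints_text: str,
                      dist: str = "apache-airflow-providers-amazon") -> str | None:
       """Return the pinned version of `dist`, or None if it is not pinned."""
       for line in constraints_text.splitlines():
           parts = line.strip().split("==", 1)
           if len(parts) == 2 and parts[0] == dist:
               return parts[1]
       return None
   ```
   
   For example, running this over the constraints file above would return `6.0.0` for the amazon provider, flagging the pin that needs bumping to `6.1.0`.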
   
   ```
   Reading remote log from 
s3://habu-stage-kedge-logs/dag_id=airflow-log-cleanup/run_id=scheduled__2022-12-14T00:00:00+00:00/task_id=log_cleanup_worker_num_1_dir_0/attempt=1.log.
   [2022-12-15, 00:01:24 UTC] {taskinstance.py:1165} INFO - Dependencies all 
met for <TaskInstance: airflow-log-cleanup.log_cleanup_worker_num_1_dir_0 
scheduled__2022-12-14T00:00:00+00:00 [queued]>
   [2022-12-15, 00:01:24 UTC] {taskinstance.py:1165} INFO - Dependencies all 
met for <TaskInstance: airflow-log-cleanup.log_cleanup_worker_num_1_dir_0 
scheduled__2022-12-14T00:00:00+00:00 [queued]>
   ```
   
   Thanks once again @Taragolis 


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
