marclamberti commented on issue #12334:
URL: https://github.com/apache/airflow/issues/12334#issuecomment-809362985
I have this issue with Airflow 2.0.1 and the Helm chart. It started happening after I enabled remote logging to S3, with the AWS credentials supplied through a Secret object.
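For what it's worth, here is a minimal sketch of the check I run from inside a worker pod (via `kubectl exec` and a Python shell) to confirm that the credentials from the Secret are picked up by Airflow. The connection id `aws_default` is an assumption; substitute whatever your `remote_log_conn_id` points to:
```python
# Hedged sketch: confirm the connection used for remote logging can reach
# the bucket. Run from a Python shell inside a worker pod.
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

# "aws_default" is an assumption; use the conn id from remote_log_conn_id.
hook = S3Hook(aws_conn_id="aws_default")
print(hook.check_for_bucket("marcl-airflow"))  # True if the credentials resolve
```
The task pods themselves look healthy: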
```
mydagdownloading.09c4d4061ada4c058dd91ac4885dd76f   1/1   Running   0   32m
mydagdownloading.2bc28a8a155f4e29b234e1d153eef52e   1/1   Running   0   37m
mydagdownloading.f25cdd3178604b3f8b343f6bb13d7840   1/1   Running   0   42m
mydagdownloading.fcca60688a324a898014cb98bcbffbc2   1/1   Running   0   34m
mydagprocessing.09d820cdb1f14a919d0e9a7661b295cf    1/1   Running   0   34m
mydagprocessing.581978d0c2064f56bc6affebc499e489    1/1   Running   0   31m
mydagprocessing.5d4b18ff4a8f4af4b2acff1595e156a3    1/1   Running   0   42m
mydagprocessing.9efc24c0813a45789c93139c5ae4fc1d    1/1   Running   0   37m
mydagstoring.3a9e3635a6294e57b605fd52c465825a       1/1   Running   0   41m
mydagstoring.86ec9747f7824bcca684fe5e944ef1b4       1/1   Running   0   31m
mydagstoring.eabf0e43467848e7988fca4e8bf45267       1/1   Running   0   34m
mydagstoring.eb1533e230ed440e8b40c0c57b361a4b       1/1   Running   0   37m
```
And whenever I try to retrieve the task logs, I get this message:
```
*** Failed to verify remote log exists s3://marcl-airflow/airflow/logs/my_dag/downloading/2021-03-29T12:33:27.660837+00:00/1.log.
An error occurred (403) when calling the HeadObject operation: Forbidden
*** Falling back to local log
*** Trying to get logs (last 100 lines) from worker pod mydagdownloading.09c4d4061ada4c058dd91ac4885dd76f ***
BACKEND=postgresql
DB_HOST=airflow-postgresql.airflow.svc.cluster.local
DB_PORT=5432
[2021-03-29 12:33:34,965] {settings.py:210} DEBUG - Setting up DB connection pool (PID 8)
[2021-03-29 12:33:34,966] {settings.py:281} DEBUG - settings.prepare_engine_args(): Using pool settings. pool_size=5, max_overflow=10, pool_recycle=1800, pid=8
[2021-03-29 12:33:35,039] {cli_action_loggers.py:40} DEBUG - Adding <function default_action_log at 0x7f91156a5400> to pre execution callback
[2021-03-29 12:33:37,012] {cli_action_loggers.py:66} DEBUG - Calling callbacks: [<function default_action_log at 0x7f91156a5400>]
[2021-03-29 12:33:37,025] {settings.py:210} DEBUG - Setting up DB connection pool (PID 8)
[2021-03-29 12:33:37,026] {settings.py:243} DEBUG - settings.prepare_engine_args(): Using NullPool
[2021-03-29 12:33:37,027] {dagbag.py:448} INFO - Filling up the DagBag from /opt/airflow/dags/repo/my_dag.py
[2021-03-29 12:33:37,027] {dagbag.py:287} DEBUG - Importing /opt/airflow/dags/repo/my_dag.py
[2021-03-29 12:33:37,039] {dagbag.py:413} DEBUG - Loaded DAG <DAG: my_dag>
[2021-03-29 12:33:37,057] {plugins_manager.py:270} DEBUG - Loading plugins
[2021-03-29 12:33:37,057] {plugins_manager.py:207} DEBUG - Loading plugins from directory: /opt/airflow/plugins
[2021-03-29 12:33:37,057] {plugins_manager.py:184} DEBUG - Loading plugins from entrypoints
[2021-03-29 12:33:37,101] {plugins_manager.py:414} DEBUG - Integrate DAG plugins
```
I think this message is misleading, since the user has full permissions on the bucket and its objects. I tried from the worker pod and I can access the bucket and put files into it. Yet the worker pods keep running and never upload any log files.
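For anyone trying to reproduce, below is essentially the call behind that error, as a minimal boto3 sketch (the bucket and key come straight from the message above). One thing worth noting: S3 returns 403 instead of 404 from HeadObject when the caller lacks `s3:ListBucket` on the bucket, so if the workers never uploaded the file, a missing key can surface as Forbidden even with full object permissions.
```python
# Minimal boto3 sketch reproducing the HeadObject call from the error above.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
bucket = "marcl-airflow"
key = "airflow/logs/my_dag/downloading/2021-03-29T12:33:27.660837+00:00/1.log"

try:
    s3.head_object(Bucket=bucket, Key=key)
    print("HeadObject succeeded; the log object exists")
except ClientError as err:
    # Without s3:ListBucket, S3 masks a missing key (404) as 403 Forbidden.
    print(err.response["Error"]["Code"], err.response["Error"]["Message"])
```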