jazzsir opened a new issue #19840:
URL: https://github.com/apache/airflow/issues/19840


   ### Apache Airflow version
   
   2.1.4
   
   ### Operating System
   
   CentOS 7.6
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==2.4.0
   apache-airflow-providers-celery==2.1.0
   apache-airflow-providers-cncf-kubernetes==2.1.0
   apache-airflow-providers-docker==2.3.0
   apache-airflow-providers-elasticsearch==2.1.0
   apache-airflow-providers-ftp==2.0.1
   apache-airflow-providers-grpc==2.0.1
   apache-airflow-providers-hashicorp==2.1.1
   apache-airflow-providers-http==2.0.1
   apache-airflow-providers-imap==2.0.1
   apache-airflow-providers-postgres==2.3.0
   apache-airflow-providers-redis==2.0.1
   apache-airflow-providers-sendgrid==2.0.1
   apache-airflow-providers-sftp==2.2.0
   apache-airflow-providers-slack==4.1.0
   apache-airflow-providers-sqlite==2.0.1
   apache-airflow-providers-ssh==2.3.0
   
   ### Deployment
   
   Official Apache Airflow Helm Chart
   
   ### Deployment details
   
   K8s: v1.16.7
   
   ### What happened
   
   I'm trying to run an `airflow dags backfill ...` command in the airflow-webserver pod.
   At first, the airflow-webserver service account did not have permission to launch pods, so I added it to the airflow-pod-launcher-rolebinding, roughly as sketched below.
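   
   For context, a hedged sketch of that RBAC change (the release name `airflow`, namespace `airflow`, and service account name `airflow-webserver` are assumptions based on default chart naming; adjust for your release):
   
   ```
   # Hypothetical sketch; resource names depend on the Helm release and are assumptions.
   kubectl -n airflow patch rolebinding airflow-pod-launcher-rolebinding \
     --type='json' \
     -p='[{"op": "add", "path": "/subjects/-", "value": {"kind": "ServiceAccount", "name": "airflow-webserver", "namespace": "airflow"}}]'
   ```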
   But then the error below occurred in all pods created by the backfill, because there is no **dags volume** in the pod spec.
   
   ```
   [2021-11-26 12:51:03,594] {cli_action_loggers.py:105} WARNING - Failed to log action with (sqlite3.OperationalError) no such table: log
   [SQL: INSERT INTO log (dttm, dag_id, task_id, event, execution_date, owner, extra) VALUES (?, ?, ?, ?, ?, ?, ?)]
   [parameters: ('2021-11-26 12:51:03.589011', 'example_bash_operator', 'runme_0', 'cli_task_run', '2021-11-26 00:00:00.000000', 'airflow', '{"host_name": "examplebashoperatorrunme0.dee72370de6a4f54b6987c9624421200", "full_command": "[\'/home/airflow/.local/bin/airflow\', \'tasks\', \'run\ ... (85 characters truncated) ... depends-on-past\', \'--local\', \'--pool\', \'default_pool\', \'--subdir\', \'DAGS_FOLDER/airflow-dag-template.git/dags/example_bash_operator.py\']"}')]
   (Background on this error at: http://sqlalche.me/e/13/e3q8)
   [2021-11-26 12:51:03,595] {dagbag.py:496} INFO - Filling up the DagBag from /opt/airflow/dags/airflow-dag-template.git/dags/example_bash_operator.py
   Traceback (most recent call last):
     File "/home/airflow/.local/bin/airflow", line 8, in <module>
       sys.exit(main())
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/__main__.py", line 40, in main
       args.func(args)
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 48, in command
       return func(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/cli.py", line 92, in wrapper
       return f(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/cli/commands/task_command.py", line 220, in task_run
       dag = get_dag(args.subdir, args.dag_id)
     File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/cli.py", line 195, in get_dag
       'parse.'.format(dag_id)
   airflow.exceptions.AirflowException: dag_id could not be found: example_bash_operator. Either the dag did not exist or it failed to parse.
   ```
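   
   The missing mount can be confirmed by listing the volumes of one of the spawned pods; a minimal check, reusing the pod name from the log above (the `airflow` namespace is an assumption):
   
   ```
   # Hedged check; the namespace is an assumption, the pod name is from the log above.
   kubectl -n airflow get pod examplebashoperatorrunme0.dee72370de6a4f54b6987c9624421200 \
     -o jsonpath='{.spec.volumes[*].name}'
   ```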
   
   ### What you expected to happen
   
   _No response_
   
   ### How to reproduce
   
   1. Install the Airflow 2.1.4 Helm chart with default values.
   2. Run a backfill command in the airflow-webserver pod (see the sketch below).
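   
   A minimal repro sketch of those steps (release name, namespace, and dates are assumptions; `example_bash_operator` is one of the bundled example DAGs):
   
   ```
   # Hedged sketch; adjust release name, namespace, and dates as needed.
   helm repo add apache-airflow https://airflow.apache.org
   helm install airflow apache-airflow/airflow --namespace airflow --create-namespace
   
   # Run the backfill from inside the webserver pod.
   kubectl -n airflow exec -it deploy/airflow-webserver -- \
     airflow dags backfill example_bash_operator -s 2021-11-26 -e 2021-11-26
   ```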
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

