v0ldemort opened a new issue #9017: URL: https://github.com/apache/airflow/issues/9017
Hi, I have built a Docker image with the apache/airflow Dockerfile and I am running it on an AWS EKS cluster using the **https://github.com/helm/charts/tree/master/stable/airflow** Helm chart. When the container starts on Kubernetes, I get the error below. Most likely this is because Airflow is not using the PostgreSQL database and is falling back to SQLite, where the `airflow` tables were never created:

```
[2020-05-26 09:46:23,114] {cli_action_loggers.py:105} WARNING - Failed to log action with (sqlite3.OperationalError) no such table: log
[SQL: INSERT INTO log (dttm, dag_id, task_id, event, execution_date, owner, extra) VALUES (?, ?, ?, ?, ?, ?, ?)]
[parameters: ('2020-05-26 09:46:23.063214', None, None, 'cli_webserver', None, 'airflow', '{"host_name": "8cd033b280df", "full_command": "[\'/home/airflow/.local/bin/airflow\', \'webserver\']"}')]
(Background on this error at: http://sqlalche.me/e/e3q8)
[Airflow ASCII-art startup banner]
[2020-05-26 09:46:23,675] {manager.py:710} WARNING - No user yet created, use flask fab command to do it.
[2020-05-26 09:46:24,869] {dagbag.py:368} INFO - Filling up the DagBag from /dev/null
[2020-05-26 09:46:27,165] {security.py:213} INFO - Initializing permissions for role:Viewer in the database.
[2020-05-26 09:46:27,225] {security.py:213} INFO - Initializing permissions for role:User in the database.
[2020-05-26 09:46:27,288] {security.py:213} INFO - Initializing permissions for role:Op in the database.
```
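Since the webserver is hitting SQLite, the PostgreSQL connection string presumably never reaches the pod, so Airflow falls back to its default. A minimal sketch of how one can verify this from inside the container (the variable name follows Airflow's standard `AIRFLOW__<SECTION>__<KEY>` environment convention; the SQLite default path shown is an assumption for illustration):

```python
import os

# Airflow reads sql_alchemy_conn from the environment before falling back
# to its default SQLite database. If the helm chart's secret/env wiring is
# broken, this variable is absent and the webserver silently uses SQLite.
conn = os.environ.get(
    "AIRFLOW__CORE__SQL_ALCHEMY_CONN",
    "sqlite:////home/airflow/airflow.db",  # assumed default, for illustration only
)

if conn.startswith("sqlite"):
    # This is exactly the failure mode in the logs above: no Postgres URL,
    # so tables like `log` and `dag` do not exist.
    print("WARNING: falling back to SQLite; the Postgres URL never reached the pod")
else:
    # Print only host/db, not credentials.
    print("Using configured database:", conn.split("@")[-1])
```

If the variable is missing, the fix is on the chart side (values/secret wiring), not in the image itself.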
```
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1284, in _execute_context
    cursor, statement, parameters, context
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 590, in do_execute
    cursor.execute(statement, parameters)
sqlite3.OperationalError: no such table: dag

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/airflow/.local/bin/airflow", line 10, in <module>
    sys.exit(main())
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/__main__.py", line 40, in main
    args.func(args)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/cli/cli_parser.py", line 52, in command
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/cli.py", line 84, in wrapper
    return f(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/cli/commands/webserver_command.py", line 205, in webserver
    app = cached_app(None)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www/app.py", line 304, in cached_app
    app, _ = create_app(config=config, testing=testing)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www/app.py", line 218, in create_app
    security_manager.sync_roles()
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www/security.py", line 507, in sync_roles
    self.create_custom_dag_permission_view()
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/session.py", line 61, in wrapper
    return func(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www/security.py", line 422, in create_custom_dag_permission_view
    .filter(or_(models.DagModel.is_active, models.DagModel.is_paused)).all()
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3319, in all
    return list(self)
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3481, in __iter__
    return self._execute_and_instances(context)
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/orm/query.py", line 3506, in _execute_and_instances
    result = conn.execute(querycontext.statement, self._params)
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1020, in execute
    return meth(self, multiparams, params)
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
    return connection._execute_clauseelement(self, multiparams, params)
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1139, in _execute_clauseelement
    distilled_params,
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1324, in _execute_context
    e, statement, parameters, cursor, context
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1518, in _handle_dbapi_exception
    sqlalchemy_exception, with_traceback=exc_info[2], from_=e
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 178, in raise_
    raise exception
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1284, in _execute_context
    cursor, statement, parameters, context
  File "/home/airflow/.local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 590, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such table: dag
[SQL: SELECT dag.dag_id AS dag_dag_id, dag.root_dag_id AS dag_root_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners, dag.description AS dag_description, dag.default_view AS dag_default_view, dag.schedule_interval AS dag_schedule_interval
FROM dag
WHERE dag.is_active = 1 OR dag.is_paused = 1]
(Background on this error at: http://sqlalche.me/e/e3q8)
```

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
