EricGao888 commented on issue #19286:
URL: https://github.com/apache/airflow/issues/19286#issuecomment-986380136


> BTW, for backwards compatibility reasons, both the `DagRun` column and `create_dagrun` argument need to be optional (None by default), and if it’s None, we’ll fall back to `data_interval_end` like we currently use (for lack of a better value).
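
For reference, the shape of the change I am attempting looks roughly like the standalone sketch below. This is not the actual Airflow code, just the intended behaviour from the suggestion above: the column is nullable, and the scheduling delay calculation falls back to `data_interval_end` whenever `run_after` is `None`.

```python
# Standalone sketch only; names mirror the suggestion above, not the real model.
from sqlalchemy import Column, Integer, TIMESTAMP
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()


class DagRun(Base):
    __tablename__ = "dag_run"

    id = Column(Integer, primary_key=True)
    data_interval_end = Column(TIMESTAMP(timezone=True))
    # New column: optional (None by default) for backwards compatibility.
    run_after = Column(TIMESTAMP(timezone=True), nullable=True)


def scheduling_delay_reference(dag_run):
    """Pick the timestamp the scheduling delay metric is measured against."""
    # Fall back to data_interval_end when run_after was never provided.
    return dag_run.run_after if dag_run.run_after is not None else dag_run.data_interval_end
```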
   
After I added the `run_after` field to `DagRun` and updated the related logic that calculates the scheduling delay metric, I got blocked by the following error when initializing the database:
```
(airflow_dev) ➜  ~ airflow standalone
standalone | Starting Airflow Standalone
standalone | Checking database is initialized
INFO  [alembic.runtime.migration] Context impl SQLiteImpl.
INFO  [alembic.runtime.migration] Will assume non-transactional DDL.
WARNI [airflow.models.crypto] empty cryptography key - values will not be stored encrypted.
Traceback (most recent call last):
  File "/opt/anaconda3/envs/airflow_dev/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
    cursor, statement, parameters, context
  File "/opt/anaconda3/envs/airflow_dev/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 608, in do_execute
    cursor.execute(statement, parameters)
sqlite3.OperationalError: no such column: dag_run.run_after
```
   
It seems the new field was somehow not mapped to a column when the table was created. May I ask which part of the code I should look into to fix this? Any help would be appreciated!
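
My current guess (please correct me if I am wrong) is that the metadata tables are created from the Alembic migrations rather than directly from the model, so adding the column to the model alone is not enough and a new migration under `airflow/migrations/versions/` is also needed. Roughly like the sketch below, where the revision ids are placeholders and the column type would need to match the other datetime columns on `dag_run`:

```python
"""Sketch of a migration adding run_after to dag_run (revision ids are placeholders)."""
import sqlalchemy as sa
from alembic import op

revision = "0000000000aa"       # placeholder
down_revision = "0000000000bb"  # placeholder: should point at the current head
branch_labels = None
depends_on = None


def upgrade():
    # Nullable, so existing rows and callers that never pass run_after keep working.
    op.add_column("dag_run", sa.Column("run_after", sa.TIMESTAMP(timezone=True), nullable=True))


def downgrade():
    op.drop_column("dag_run", "run_after")
```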

