@ashb - the problem I ran into was that (at least for postgres) the data was 
being read from a naive timestamp column in the db (pre 
0e2a74e0fc9f_add_time_zone_awareness.py), so the python code was converting it 
to an assumed-UTC date. The session.merge of the task instance then fails 
because (at least in postgres) the timestamp with timezone is passed into the 
query as a full string, so the key no longer matches the database, you get an 
error, and the migration fails:
**'sqlalchemy.orm.exc.StaleDataError: UPDATE statement on table 'task_instance' 
expected to update 1 row(s); 0 were matched.'**
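
To make the mismatch concrete, here's a minimal sketch (not the actual 
Airflow/UtcDateTime code, just the coercion described above):

```python
from datetime import datetime, timezone

# Value as stored in the pre-migration, naive TIMESTAMP column:
stored = datetime(2018, 10, 1, 12, 0, 0)

# What a UtcDateTime-style result processor hands back to Python,
# assuming the naive value was UTC:
coerced = stored.replace(tzinfo=timezone.utc)

print(stored.isoformat())   # 2018-10-01T12:00:00
print(coerced.isoformat())  # 2018-10-01T12:00:00+00:00

# Naive and aware datetimes never compare equal - the same reason the
# UPDATE's WHERE clause matches 0 rows when the aware value is bound
# against the still-naive key column:
print(stored == coerced)    # False
```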

I don't think the migration really needs the date - it just needs the primary 
key. For runtime code etc. I agree 100% that UtcDateTime should be used; 
however, since during this migration the database is storing naive dates AND 
we are only using the date as a value type, I believe it should be safe to 
keep it naive in the migration code.
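
Roughly what I mean, as a hedged sketch (column list trimmed, not the literal 
migration diff):

```python
import sqlalchemy as sa

# Ad-hoc table for the migration: execution_date stays a plain, naive
# DateTime so the value read from the old column round-trips untouched
# and still matches the primary key in the UPDATE.
task_instance = sa.table(
    'task_instance',
    sa.column('task_id', sa.String()),
    sa.column('dag_id', sa.String()),
    sa.column('execution_date', sa.DateTime()),  # naive on purpose
)
```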

I'll see if I can test with mysql and confirm this doesn't cause any new 
errors. To reproduce:
  * install airflow 1.8 or 1.9
  * run a task (so task_instance table isn't empty)
  * upgrade to 1.10


