Yeah, we first shut down all airflow components and only then perform the upgrade. As another safety measure you could take a dump of your meta database - easy with postgres and pg_dump. And as Jarek said, the alembic migrations usually work perfectly. I only remember a single release (2.3.0...?) with minor issues depending on the upgrade path. Might even have been on me, not sure. But especially if you're new to python deployments, you might want to take every safety measure you can. If your vm environment permits, take a snapshot of the airflow vm itself. Btw, even when everything works out of the box (which it will if you follow the docs) and you still want to roll back, you can just use alembic to downgrade to the old db schema.
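For illustration, a rough sketch of what that could look like - the db name, credentials, systemd unit names and target version are placeholders, not our exact setup:

    # stop the components before touching anything
    $ sudo systemctl stop airflow-webserver airflow-scheduler

    # custom-format dump of the metadata db, restorable later with pg_restore
    $ pg_dump -Fc -U airflow -h localhost airflow_db > airflow_meta_$(date +%F).dump

    # if the old schema is ever needed again, downgrade the migrations,
    # e.g. back to the schema that shipped with 2.6.3
    $ airflow db downgrade --to-version "2.6.3"

The `airflow db downgrade` command (available in recent 2.x releases) just drives the alembic downgrade for you; calling alembic directly works as well if you prefer.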
Regarding config files - we just replace them with new ones. But we're using ansible for deployments, and the playbooks are git versioned, so we can always go back in time without any trouble.

I hope that has given you some pointers.

Take care,
Lars

On 9 November 2023 22:51:23 CET, Ben Hancock <bhanc...@alm.com.INVALID> wrote:
>Jarek writes:
>
>> Yes - what Lars wrote about Python changing. This is nothing
>> "airflow" specific - you cannot just "upgrade" installed packages
>> with python - but that's basic python stuff.
>
>Thank you Jarek and Lars, this is very helpful. One question I still
>have that I think /is/ Airflow-specific is about the behavior of the
>`airflow db upgrade` command.
>
>Will running that command *after* upgrading to the latest version of
>Airflow handle any necessary schema updates, regardless of which
>previous version was installed?
>
>I can't quite tell from the docs whether I should be explicitly passing
>the versions, for example upgrading from 2.6.3 to the latest:
>
>    $ airflow db upgrade -r "2.6.3:2.7.3"
>
>... or if that is handled automatically (provided my machine is not
>offline). That's what this seems to suggest:
>
>https://airflow.apache.org/docs/apache-airflow/2.3.0/installation/upgrading.html#offline-sql-migration-scripts
>
>Lars writes:
>
>> We're also running in vms, and we create a new env for every upgrade,
>> so we can easily rollback if the pip installation fails e.g. because
>> of custom dependencies and the like.
>
>That seems like a nice approach. I would assume this means you shut down
>the web server and scheduler in the existing venv, and then point the
>installation in the new venv at your existing metadata db. Is that the
>idea? Do you copy over your existing airflow.cfg, and then modify as needed?
>
>Thank you again!
>
>- Ben
>
>---------------------------------------------------------------------
>To unsubscribe, e-mail: users-unsubscr...@airflow.apache.org
>For additional commands, e-mail: users-h...@airflow.apache.org
>
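One addition to the `airflow db upgrade` question quoted above: at least in our experience you don't need to pass a version range when the metadata db is reachable - the command looks up the current schema revision in the db and migrates to the latest one. The range syntax in the doc you linked is, as far as I know, only needed for generating offline SQL migration scripts. Roughly, the new-venv flow looks like this (versions, paths and the python version are just examples, adjust to your environment):

    # fresh venv for the new version, the old one stays around for rollback
    $ python3 -m venv /opt/airflow/venv-2.7.3
    $ /opt/airflow/venv-2.7.3/bin/pip install "apache-airflow==2.7.3" \
        --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.7.3/constraints-3.10.txt"

    # keep the same AIRFLOW_HOME, so the existing airflow.cfg and the
    # metadata db connection are picked up unchanged (edit the cfg if the
    # new version deprecates options)
    $ export AIRFLOW_HOME=/opt/airflow

    # migrate the existing db to the new schema - no version range needed
    $ /opt/airflow/venv-2.7.3/bin/airflow db upgrade

The constraint file has to match the python version of the venv (3.10 in this example).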