This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
     new 58b3771bf0 Add documentation for FAB DB commands (#42352)
58b3771bf0 is described below

commit 58b3771bf04634de3a6b0ac9db9bc3a99776ed3d
Author: Ephraim Anierobi <[email protected]>
AuthorDate: Fri Sep 20 17:19:02 2024 +0100

    Add documentation for FAB DB commands (#42352)
    
    * Add documentation for FAB DB commands
    
    We recently separated FAB migration from Airflow's migration and added some commands
    for handling DB upgrade and downgrade in FAB. This PR adds user-facing documentation
    on how to upgrade and downgrade FAB and information about the different commands.
    
    I also added documentation on how contributors can hook their application's migration
    into Airflow's migration.
    
    The update_migration_references pre-commit was also updated to include FAB migrations.
    
    * fixup! Add documentation for FAB DB commands
    
    * fixup! fixup! Add documentation for FAB DB commands
---
 contributing-docs/13_metadata_database_updates.rst | 37 +++++++++++++
 docs/apache-airflow-providers-fab/index.rst        |  7 +++
 .../migrations-ref.rst                             | 49 +++++++++++++++++
 docs/apache-airflow-providers-fab/upgrading.rst    | 63 ++++++++++++++++++++++
 scripts/in_container/run_migration_reference.py    | 23 ++++----
 5 files changed, 170 insertions(+), 9 deletions(-)

diff --git a/contributing-docs/13_metadata_database_updates.rst b/contributing-docs/13_metadata_database_updates.rst
index 7fd702dbdf..e7ff6428d8 100644
--- a/contributing-docs/13_metadata_database_updates.rst
+++ b/contributing-docs/13_metadata_database_updates.rst
@@ -49,6 +49,43 @@ After your new migration file is run through pre-commit it will look like this:
 
 This represents that your migration is the 1234th migration and expected for release in Airflow version A.B.C.
 
+How to hook your application into Airflow's migration process
+-------------------------------------------------------------
+
+Airflow 3.0.0 introduces a new feature that allows you to hook your application into Airflow's migration process.
+This feature is useful if you have a custom database schema that you want to migrate along with Airflow's schema.
+This guide shows you how to set that up.
+
+Subclass the BaseDBManager
+==========================
+To hook your application into Airflow's migration process, subclass the ``BaseDBManager`` class from the
+``airflow.utils.db_manager`` module. This class provides methods for running Alembic migrations.
+
+Create Alembic migration scripts
+================================
+At the root of your application, run ``alembic init migrations`` to create a new migrations directory. Set the
+``version_table`` variable in the ``env.py`` file to the name of the table that stores your application's migration
+history, and pass it as the ``version_table`` argument of Alembic's ``context.configure`` method in both the
+``run_migrations_online`` and ``run_migrations_offline`` functions. This ensures that your application's migrations
+are stored in a separate table from Airflow's migrations.
+
+Next, define an ``include_object`` function in ``env.py`` that ensures that only your application's metadata is
+included in your application's migrations. Pass it too via the ``context.configure`` method in the
+``run_migrations_online`` and ``run_migrations_offline`` functions.
+Next, configure logging from the config file without disabling existing loggers:
+
+.. code-block:: python
+
+    if config.config_file_name is not None:
+        fileConfig(config.config_file_name, disable_existing_loggers=False)
+
+Replace the content of your application's ``alembic.ini`` file with a copy of Airflow's ``alembic.ini``.
+
+If the above is not clear, you might want to look at the FAB implementation of this migration.
+
+Once this is set up, if you want Airflow to run your migrations as part of ``airflow db migrate``, add your DB
+manager to the ``[core] external_db_managers`` configuration option.
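For illustration, assuming your DB manager lives at a hypothetical import path ``my_app.db_manager.MyAppDBManager``, the option would look roughly like:

```ini
# airflow.cfg (sketch; my_app.db_manager.MyAppDBManager is a hypothetical path)
[core]
external_db_managers = my_app.db_manager.MyAppDBManager
```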
+
 --------
 
 You can also learn how to setup your `Node environment <14_node_environment_setup.rst>`__ if you want to develop Airflow UI.
diff --git a/docs/apache-airflow-providers-fab/index.rst b/docs/apache-airflow-providers-fab/index.rst
index 40b41e12f8..65b29a95ce 100644
--- a/docs/apache-airflow-providers-fab/index.rst
+++ b/docs/apache-airflow-providers-fab/index.rst
@@ -35,6 +35,13 @@
     :caption: Guides
 
     Auth manager <auth-manager/index>
+    Upgrading <upgrading>
+
+.. toctree::
+    :hidden:
+    :caption: Internal DB details
+
+    Database Migrations <migrations-ref>
 
 .. toctree::
     :hidden:
diff --git a/docs/apache-airflow-providers-fab/migrations-ref.rst b/docs/apache-airflow-providers-fab/migrations-ref.rst
new file mode 100644
index 0000000000..49c6df64e0
--- /dev/null
+++ b/docs/apache-airflow-providers-fab/migrations-ref.rst
@@ -0,0 +1,49 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Reference for Database Migrations
+'''''''''''''''''''''''''''''''''
+
+Here's the list of all the database migrations that are executed when you run ``airflow fab-db migrate``.
+
+.. warning::
+
+   These migration details are listed here mostly to make users aware of when and what kind of migrations
+   will be executed during migrations between specific FAB provider versions. The intention here is that
+   "DB conscious" users might perform an analysis on the migrations and draw conclusions about the impact
+   of the migrations on their Airflow database. Those users might also want to take a look at the
+   :doc:`apache-airflow:database-erd-ref` document to understand how the internal structure of the Airflow DB
+   looks. However, you should be aware that the structure is internal and you should not access the DB directly
+   to retrieve or modify any data - you should use the :doc:`REST API <stable-rest-api-ref>` instead.
+
+
+
+ .. This table is automatically updated by pre-commit by ``scripts/ci/pre_commit/migration_reference.py``
+ .. All table elements are scraped from migration files
+ .. Beginning of auto-generated table
+
++-------------------------+--------------+---------------+------------------------+
+| Revision ID             | Revises ID   | Fab Version   | Description            |
++=========================+==============+===============+========================+
+| ``6709f7a774b9`` (head) | ``None``     | ``1.3.0``     | placeholder migration. |
++-------------------------+--------------+---------------+------------------------+
+
+ .. End of auto-generated table
+
+.. spelling:word-list::
+    branchpoint
+    mergepoint
diff --git a/docs/apache-airflow-providers-fab/upgrading.rst b/docs/apache-airflow-providers-fab/upgrading.rst
new file mode 100644
index 0000000000..28c046d9c0
--- /dev/null
+++ b/docs/apache-airflow-providers-fab/upgrading.rst
@@ -0,0 +1,63 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Upgrading FAB to a newer version
+--------------------------------
+Before reading this, make sure you have read the Airflow Upgrade Guide for how to prepare for an upgrade:
+:doc:`apache-airflow:installation/upgrading`
+
+Why you need to upgrade
+=======================
+The FAB provider is a separate package from Airflow and is released independently. Starting from version 1.3.0, FAB
+can run its own migrations if you are on Airflow 3. Newer FAB versions can contain database migrations, so you
+must run ``airflow fab-db migrate`` to migrate your database with the schema changes in the FAB version you are
+upgrading to. If ``FABDBManager`` is included in the ``[core] external_db_managers`` configuration, the migrations
+will be run automatically as part of the ``airflow db migrate`` command.
+
+How to upgrade
+==============
+To upgrade the FAB provider, install the new version of the package using ``pip``.
+After the installation, run the DB upgrade of the FAB provider with the following command:
+``airflow fab-db migrate``. This command is only available on Airflow 3.0.0 or newer.
+
+The command takes the same options as the ``airflow db migrate`` command; you can learn more about it by
+running ``airflow fab-db migrate --help``.
+
+How to downgrade
+================
+If you need to downgrade the FAB provider, first run the downgrade command with the version you want to
+downgrade to, for example ``airflow fab-db downgrade --to-version 1.2.0``. Afterwards, install the target FAB
+provider version using ``pip``.
+
+There are other options to this command; check them out by running ``airflow fab-db downgrade --help``.
+
+Resetting the FAB database
+==========================
+If you need to reset the FAB database, run the reset command: ``airflow fab-db reset``.
+This command will drop all tables in the FAB database and recreate them. It is only available on
+Airflow 3.0.0 or newer. There are other options to this command; check them out by running
+``airflow fab-db reset --help``.
+
+Offline SQL migration scripts
+=============================
+If you want to run the upgrade script offline, you can use the ``-s`` or ``--show-sql-only`` flag
+to get the SQL statements that would be executed. You may also specify the starting FAB version with the
+``--from-version`` flag and the ending FAB version with the ``-n`` or ``--to-version`` flag.
+This feature is supported in Postgres and MySQL.
+
+Sample usage for Airflow version 3.0.0 or greater:
+   ``airflow fab-db migrate -s --from-version "1.3.0" -n "1.4.0"``
+   ``airflow fab-db migrate --show-sql-only --from-version "1.3.0" --to-version "1.4.0"``
diff --git a/scripts/in_container/run_migration_reference.py b/scripts/in_container/run_migration_reference.py
index fc7d3bd084..a819c685e7 100755
--- a/scripts/in_container/run_migration_reference.py
+++ b/scripts/in_container/run_migration_reference.py
@@ -61,7 +61,7 @@ def wrap_backticks(val):
     return ",\n".join(map(_wrap_backticks, val)) if isinstance(val, (tuple, list)) else _wrap_backticks(val)
 
 
-def update_doc(file, data):
+def update_doc(file, data, app):
     replace_text_between(
         file=file,
         start=" .. Beginning of auto-generated table\n",
@@ -71,7 +71,7 @@ def update_doc(file, data):
             headers={
                 "revision": "Revision ID",
                 "down_revision": "Revises ID",
-                "version": "Airflow Version",
+                "version": f"{app.title()} Version",
                 "description": "Description",
             },
             tabular_data=data,
@@ -142,21 +142,27 @@ def get_revisions(app="airflow") -> Iterable[Script]:
         yield from script.walk_revisions()
 
 
-def update_docs(revisions: Iterable[Script]):
+def update_docs(revisions: Iterable[Script], app="airflow"):
     doc_data = []
     for rev in revisions:
+        app_revision = rev.module.airflow_version if app == "airflow" else rev.module.fab_version
         doc_data.append(
             dict(
                 revision=wrap_backticks(rev.revision) + revision_suffix(rev),
                 down_revision=wrap_backticks(rev.down_revision),
-                version=wrap_backticks(rev.module.airflow_version),  # type: ignore
+                version=wrap_backticks(app_revision),  # type: ignore
                 description="\n".join(textwrap.wrap(rev.doc, width=60)),
             )
         )
+    if app == "fab":
+        filepath = project_root / "docs" / "apache-airflow-providers-fab" / "migrations-ref.rst"
+    else:
+        filepath = project_root / "docs" / "apache-airflow" / "migrations-ref.rst"
 
     update_doc(
-        file=project_root / "docs" / "apache-airflow" / "migrations-ref.rst",
+        file=filepath,
         data=doc_data,
+        app=app,
     )
 
 
@@ -243,7 +249,6 @@ if __name__ == "__main__":
         console.print("[bright_blue]Making sure filenames are sorted")
         ensure_filenames_are_sorted(revisions=revisions, app=app)
         revisions = list(get_revisions(app=app))
-        if app == "airflow":
-            console.print("[bright_blue]Updating documentation")
-            update_docs(revisions=revisions)
-            console.print("[green]Migrations OK")
+        console.print("[bright_blue]Updating documentation")
+        update_docs(revisions=revisions, app=app)
+        console.print("[green]Migrations OK")
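The per-app version lookup in the ``update_docs`` change can be sketched in isolation (``SimpleNamespace`` stands in for a migration module; version strings are made up):

```python
from types import SimpleNamespace


def module_version(module, app="airflow"):
    # Same dispatch as in update_docs(): core migration modules carry
    # airflow_version, FAB migration modules carry fab_version.
    return module.airflow_version if app == "airflow" else module.fab_version


# Stand-in migration module for demonstration purposes.
mod = SimpleNamespace(airflow_version="2.10.2", fab_version="1.3.0")
```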
