This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
     new 4fa4d1cc0c9 Update import in docs to use Task SDK objects (#48810)
4fa4d1cc0c9 is described below

commit 4fa4d1cc0c9c046c69ef4e0ba2088dd8674b4694
Author: Kaxil Naik <[email protected]>
AuthorDate: Sat Apr 5 03:28:15 2025 +0530

    Update import in docs to use Task SDK objects (#48810)
    
    We should re-do the docs but this makes it correct for now.
---
 .../logging-monitoring/callbacks.rst                   |  2 +-
 airflow-core/docs/authoring-and-scheduling/cron.rst    |  2 +-
 .../docs/authoring-and-scheduling/deferring.rst        |  2 +-
 .../authoring-and-scheduling/dynamic-task-mapping.rst  |  2 +-
 .../docs/authoring-and-scheduling/event-scheduling.rst |  4 ++--
 airflow-core/docs/best-practices.rst                   | 18 +++++++++---------
 airflow-core/docs/core-concepts/dag-run.rst            |  4 ++--
 airflow-core/docs/core-concepts/dags.rst               |  8 ++++----
 airflow-core/docs/core-concepts/operators.rst          |  2 +-
 airflow-core/docs/core-concepts/params.rst             |  2 +-
 airflow-core/docs/core-concepts/variables.rst          |  2 +-
 airflow-core/docs/faq.rst                              |  2 +-
 airflow-core/docs/howto/connection.rst                 |  4 ++--
 airflow-core/docs/howto/custom-operator.rst            |  2 +-
 airflow-core/docs/howto/define-extra-link.rst          |  9 +++------
 airflow-core/docs/howto/dynamic-dag-generation.rst     |  2 +-
 airflow-core/docs/howto/notifications.rst              |  2 +-
 airflow-core/docs/howto/timetable.rst                  |  4 ++--
 airflow-core/docs/howto/variable.rst                   |  2 +-
 airflow-core/docs/index.rst                            |  2 +-
 20 files changed, 37 insertions(+), 40 deletions(-)

diff --git a/airflow-core/docs/administration-and-deployment/logging-monitoring/callbacks.rst b/airflow-core/docs/administration-and-deployment/logging-monitoring/callbacks.rst
index 417443192d0..377d06579cf 100644
--- a/airflow-core/docs/administration-and-deployment/logging-monitoring/callbacks.rst
+++ b/airflow-core/docs/administration-and-deployment/logging-monitoring/callbacks.rst
@@ -65,7 +65,7 @@ In the following example, failures in any task call the ``task_failure_alert`` f
     import datetime
     import pendulum
 
-    from airflow import DAG
+    from airflow.sdk import DAG
     from airflow.providers.standard.operators.empty import EmptyOperator
 
 
diff --git a/airflow-core/docs/authoring-and-scheduling/cron.rst b/airflow-core/docs/authoring-and-scheduling/cron.rst
index c56bb72c1b3..4f594bfd256 100644
--- a/airflow-core/docs/authoring-and-scheduling/cron.rst
+++ b/airflow-core/docs/authoring-and-scheduling/cron.rst
@@ -24,7 +24,7 @@ or one of the :ref:`cron-presets`.
 
 .. code-block:: python
 
-    from airflow.models.dag import DAG
+    from airflow.sdk import DAG
 
     import datetime
 
diff --git a/airflow-core/docs/authoring-and-scheduling/deferring.rst b/airflow-core/docs/authoring-and-scheduling/deferring.rst
index a551d4fc59a..d7b78d73b3e 100644
--- a/airflow-core/docs/authoring-and-scheduling/deferring.rst
+++ b/airflow-core/docs/authoring-and-scheduling/deferring.rst
@@ -220,7 +220,7 @@ Below is an outline of how you can achieve this.
 
     import asyncio
 
-    from airflow.models.baseoperator import BaseOperator
+    from airflow.sdk import BaseOperator
     from airflow.triggers.base import BaseTrigger, TriggerEvent
 
 
diff --git a/airflow-core/docs/authoring-and-scheduling/dynamic-task-mapping.rst b/airflow-core/docs/authoring-and-scheduling/dynamic-task-mapping.rst
index 5a70316aba3..a9ead45d25a 100644
--- a/airflow-core/docs/authoring-and-scheduling/dynamic-task-mapping.rst
+++ b/airflow-core/docs/authoring-and-scheduling/dynamic-task-mapping.rst
@@ -253,7 +253,7 @@ In this example, you have a regular data delivery to an S3 bucket and want to ap
 
     from datetime import datetime
 
-    from airflow import DAG
+    from airflow.sdk import DAG
     from airflow.decorators import task
     from airflow.providers.amazon.aws.hooks.s3 import S3Hook
     from airflow.providers.amazon.aws.operators.s3 import S3ListOperator
diff --git a/airflow-core/docs/authoring-and-scheduling/event-scheduling.rst b/airflow-core/docs/authoring-and-scheduling/event-scheduling.rst
index fd1ea7d3c39..1ee67fecc50 100644
--- a/airflow-core/docs/authoring-and-scheduling/event-scheduling.rst
+++ b/airflow-core/docs/authoring-and-scheduling/event-scheduling.rst
@@ -42,9 +42,9 @@ SQS:
 
 .. code-block:: python
 
-    from airflow.sdk.definitions.asset import Asset, AssetWatcher
+    from airflow.sdk import Asset, AssetWatcher
+    from airflow.providers.common.msgq.triggers.msg_queue import MessageQueueTrigger
-    from airflow import DAG
+    from airflow.sdk import DAG
     from datetime import datetime
 
    # Define a trigger that listens to an external message queue (AWS SQS in this case)
diff --git a/airflow-core/docs/best-practices.rst b/airflow-core/docs/best-practices.rst
index f845eb41c76..80c79943730 100644
--- a/airflow-core/docs/best-practices.rst
+++ b/airflow-core/docs/best-practices.rst
@@ -124,7 +124,7 @@ Not avoiding top-level DAG code:
 
   import pendulum
 
-  from airflow import DAG
+  from airflow.sdk import DAG
   from airflow.decorators import task
 
 
@@ -153,7 +153,7 @@ Avoiding top-level DAG code:
 
   import pendulum
 
-  from airflow import DAG
+  from airflow.sdk import DAG
   from airflow.decorators import task
 
 
@@ -227,7 +227,7 @@ Imagine this code:
 
 .. code-block:: python
 
-  from airflow import DAG
+  from airflow.sdk import DAG
   from airflow.providers.standard.operators.python import PythonOperator
   import pendulum
 
@@ -259,7 +259,7 @@ What you can do to check it is add some print statements to the code you want to
 
 .. code-block:: python
 
-  from airflow import DAG
+  from airflow.sdk import DAG
   from airflow.providers.standard.operators.python import PythonOperator
   import pendulum
 
@@ -408,7 +408,7 @@ Bad example:
 
 .. code-block:: python
 
-    from airflow.models import Variable
+    from airflow.sdk import Variable
 
     foo_var = Variable.get("foo")  # AVOID THAT
     bash_use_variable_bad_1 = BashOperator(
@@ -459,7 +459,7 @@ Bad example:
 
 .. code-block:: python
 
-    from airflow.models.variable import Variable
+    from airflow.sdk import Variable
     from airflow.timetables.interval import CronDataIntervalTimetable
 
 
@@ -472,7 +472,7 @@ Good example:
 
 .. code-block:: python
 
-    from airflow.models.variable import Variable
+    from airflow.sdk import Variable
     from airflow.timetables.interval import CronDataIntervalTimetable
 
 
@@ -535,7 +535,7 @@ It's easier to grab the concept with an example. Let's say that we have the foll
 
     from datetime import datetime
 
-    from airflow import DAG
+    from airflow.sdk import DAG
     from airflow.decorators import task
     from airflow.exceptions import AirflowException
     from airflow.providers.standard.operators.bash import BashOperator
@@ -781,7 +781,7 @@ This is an example test want to verify the structure of a code-generated DAG aga
     import pendulum
     import pytest
 
-    from airflow import DAG
+    from airflow.sdk import DAG
     from airflow.utils.state import DagRunState, TaskInstanceState
     from airflow.utils.types import DagRunTriggeredByType, DagRunType
 
diff --git a/airflow-core/docs/core-concepts/dag-run.rst b/airflow-core/docs/core-concepts/dag-run.rst
index 7882fc94c65..1befa0b7c13 100644
--- a/airflow-core/docs/core-concepts/dag-run.rst
+++ b/airflow-core/docs/core-concepts/dag-run.rst
@@ -101,7 +101,7 @@ then you will want to turn catchup off, which is the default setting or can be d
    https://github.com/apache/airflow/blob/main/airflow/example_dags/tutorial.py
     """
 
-    from airflow.models.dag import DAG
+    from airflow.sdk import DAG
     from airflow.providers.standard.operators.bash import BashOperator
 
     import datetime
@@ -241,7 +241,7 @@ Example of a parameterized DAG:
 
     import pendulum
 
-    from airflow import DAG
+    from airflow.sdk import DAG
     from airflow.providers.standard.operators.bash import BashOperator
 
     dag = DAG(
diff --git a/airflow-core/docs/core-concepts/dags.rst b/airflow-core/docs/core-concepts/dags.rst
index 8f915df6b22..98419fb5c05 100644
--- a/airflow-core/docs/core-concepts/dags.rst
+++ b/airflow-core/docs/core-concepts/dags.rst
@@ -47,7 +47,7 @@ which will add anything inside it to the DAG implicitly:
 
     import datetime
 
-    from airflow import DAG
+    from airflow.sdk import DAG
     from airflow.providers.standard.operators.empty import EmptyOperator
 
     with DAG(
@@ -65,7 +65,7 @@ Or, you can use a standard constructor, passing the DAG into any operators you u
 
     import datetime
 
-    from airflow import DAG
+    from airflow.sdk import DAG
     from airflow.providers.standard.operators.empty import EmptyOperator
 
     my_dag = DAG(
@@ -459,7 +459,7 @@ You can also combine this with the :ref:`concepts:depends-on-past` functionality
         import pendulum
 
         from airflow.decorators import task
-        from airflow.models import DAG
+        from airflow.sdk import DAG
         from airflow.providers.standard.operators.empty import EmptyOperator
 
         dag = DAG(
@@ -577,7 +577,7 @@ TaskGroup also supports ``default_args`` like DAG, it will overwrite the ``defau
 
     import datetime
 
-    from airflow import DAG
+    from airflow.sdk import DAG
     from airflow.decorators import task_group
     from airflow.providers.standard.operators.bash import BashOperator
     from airflow.providers.standard.operators.empty import EmptyOperator
diff --git a/airflow-core/docs/core-concepts/operators.rst b/airflow-core/docs/core-concepts/operators.rst
index 24c1357955a..524a43e04a5 100644
--- a/airflow-core/docs/core-concepts/operators.rst
+++ b/airflow-core/docs/core-concepts/operators.rst
@@ -204,7 +204,7 @@ This approach disables the rendering of both macros and files and can be applied
 
 .. code-block:: python
 
-    from airflow.utils.template import literal
+    from airflow.sdk import literal
 
 
     fixed_print_script = BashOperator(
diff --git a/airflow-core/docs/core-concepts/params.rst b/airflow-core/docs/core-concepts/params.rst
index 636ff11b000..56847d7f863 100644
--- a/airflow-core/docs/core-concepts/params.rst
+++ b/airflow-core/docs/core-concepts/params.rst
@@ -38,7 +38,7 @@ Use a dictionary that maps Param names to either a :class:`~airflow.sdk.definiti
 .. code-block::
    :emphasize-lines: 7-10
 
-    from airflow import DAG
+    from airflow.sdk import DAG
     from airflow.decorators import task
     from airflow.sdk import Param
 
diff --git a/airflow-core/docs/core-concepts/variables.rst b/airflow-core/docs/core-concepts/variables.rst
index c05d900b5b9..9a4cc94c4f1 100644
--- a/airflow-core/docs/core-concepts/variables.rst
+++ b/airflow-core/docs/core-concepts/variables.rst
@@ -22,7 +22,7 @@ Variables are Airflow's runtime configuration concept - a general key/value stor
 
 To use them, just import and call ``get`` on the Variable model::
 
-    from airflow.models import Variable
+    from airflow.sdk import Variable
 
     # Normal call style
     foo = Variable.get("foo")
diff --git a/airflow-core/docs/faq.rst b/airflow-core/docs/faq.rst
index 16fb0c5b56d..d9d5bccca00 100644
--- a/airflow-core/docs/faq.rst
+++ b/airflow-core/docs/faq.rst
@@ -181,7 +181,7 @@ until ``min_file_process_interval`` is reached since DAG Parser will look for mo
    :caption: dag_loader.py
    :name: dag_loader.py
 
-    from airflow import DAG
+    from airflow.sdk import DAG
     from airflow.decorators import task
 
     import pendulum
diff --git a/airflow-core/docs/howto/connection.rst b/airflow-core/docs/howto/connection.rst
index 54037634d30..b4eb2d4d7c4 100644
--- a/airflow-core/docs/howto/connection.rst
+++ b/airflow-core/docs/howto/connection.rst
@@ -406,7 +406,7 @@ convenience method :py:meth:`~airflow.models.connection.Connection.get_uri`.  It
 .. code-block:: pycon
 
     >>> import json
-    >>> from airflow.models.connection import Connection
+    >>> from airflow.sdk import Connection
     >>> c = Connection(
     ...     conn_id="some_conn",
     ...     conn_type="mysql",
@@ -483,7 +483,7 @@ You can verify a URI is parsed correctly like so:
 
 .. code-block:: pycon
 
-    >>> from airflow.models.connection import Connection
+    >>> from airflow.sdk import Connection
 
    >>> c = Connection(uri="my-conn-type://my-login:my-password@my-host:5432/my-schema?param1=val1&param2=val2")
     >>> print(c.login)
diff --git a/airflow-core/docs/howto/custom-operator.rst b/airflow-core/docs/howto/custom-operator.rst
index f2b8db712df..b76a2277fbf 100644
--- a/airflow-core/docs/howto/custom-operator.rst
+++ b/airflow-core/docs/howto/custom-operator.rst
@@ -44,7 +44,7 @@ Let's implement an example ``HelloOperator`` in a new file ``hello_operator.py``
 
 .. code-block:: python
 
-        from airflow.models.baseoperator import BaseOperator
+        from airflow.sdk import BaseOperator
 
 
         class HelloOperator(BaseOperator):
diff --git a/airflow-core/docs/howto/define-extra-link.rst b/airflow-core/docs/howto/define-extra-link.rst
index de1cccdf73d..0a1f1b04689 100644
--- a/airflow-core/docs/howto/define-extra-link.rst
+++ b/airflow-core/docs/howto/define-extra-link.rst
@@ -30,7 +30,7 @@ The following code shows how to add extra links to an operator via Plugins:
 
 .. code-block:: python
 
-    from airflow.models.baseoperator import BaseOperator
+    from airflow.sdk import BaseOperator
     from airflow.sdk import BaseOperatorLink
     from airflow.models.taskinstancekey import TaskInstanceKey
     from airflow.plugins_manager import AirflowPlugin
@@ -84,8 +84,7 @@ tasks using :class:`~airflow.providers.amazon.aws.transfers.gcs_to_s3.GCSToS3Ope
 
 .. code-block:: python
 
-  from airflow.models.baseoperator import BaseOperator
-  from airflow.sdk import BaseOperatorLink
+  from airflow.sdk import BaseOperator, BaseOperatorLink
   from airflow.models.taskinstancekey import TaskInstanceKey
   from airflow.plugins_manager import AirflowPlugin
   from airflow.providers.amazon.aws.transfers.gcs_to_s3 import GCSToS3Operator
@@ -127,10 +126,8 @@ Console, but if we wanted to change that link we could:
 
 .. code-block:: python
 
-    from airflow.models.baseoperator import BaseOperator
-    from airflow.sdk import BaseOperatorLink
+    from airflow.sdk import BaseOperator, BaseOperatorLink
     from airflow.models.taskinstancekey import TaskInstanceKey
-    from airflow.models.xcom import XCom
     from airflow.plugins_manager import AirflowPlugin
    from airflow.providers.google.cloud.operators.bigquery import BigQueryOperator
 
diff --git a/airflow-core/docs/howto/dynamic-dag-generation.rst b/airflow-core/docs/howto/dynamic-dag-generation.rst
index 7818afd9be6..5002e9f6830 100644
--- a/airflow-core/docs/howto/dynamic-dag-generation.rst
+++ b/airflow-core/docs/howto/dynamic-dag-generation.rst
@@ -206,7 +206,7 @@ of the context are set to ``None``.
 .. code-block:: python
   :emphasize-lines: 4,8,9
 
-  from airflow.models.dag import DAG
+  from airflow.sdk import DAG
   from airflow.sdk import get_parsing_context
 
   current_dag_id = get_parsing_context().dag_id
diff --git a/airflow-core/docs/howto/notifications.rst b/airflow-core/docs/howto/notifications.rst
index 0f53b20027d..38a545bb4d8 100644
--- a/airflow-core/docs/howto/notifications.rst
+++ b/airflow-core/docs/howto/notifications.rst
@@ -58,7 +58,7 @@ Here's an example of using the above notifier:
 
     from datetime import datetime
 
-    from airflow.models.dag import DAG
+    from airflow.sdk import DAG
     from airflow.providers.standard.operators.bash import BashOperator
 
     from myprovider.notifier import MyNotifier
diff --git a/airflow-core/docs/howto/timetable.rst b/airflow-core/docs/howto/timetable.rst
index 06d28a8626b..930409345f6 100644
--- a/airflow-core/docs/howto/timetable.rst
+++ b/airflow-core/docs/howto/timetable.rst
@@ -73,7 +73,7 @@ file:
 
     import pendulum
 
-    from airflow import DAG
+    from airflow.sdk import DAG
     from airflow.example_dags.plugins.workday import AfterWorkdayTimetable
 
 
@@ -194,7 +194,7 @@ For reference, here's our plugin and DAG files in their entirety:
 
     import pendulum
 
-    from airflow import DAG
+    from airflow.sdk import DAG
     from airflow.example_dags.plugins.workday import AfterWorkdayTimetable
     from airflow.providers.standard.operators.empty import EmptyOperator
 
diff --git a/airflow-core/docs/howto/variable.rst b/airflow-core/docs/howto/variable.rst
index 98af64811a7..e7d4c549324 100644
--- a/airflow-core/docs/howto/variable.rst
+++ b/airflow-core/docs/howto/variable.rst
@@ -52,7 +52,7 @@ You can use them in your dags as:
 
 .. code-block:: python
 
-    from airflow.models import Variable
+    from airflow.sdk import Variable
 
     foo = Variable.get("foo")
     foo_json = Variable.get("foo_baz", deserialize_json=True)
diff --git a/airflow-core/docs/index.rst b/airflow-core/docs/index.rst
index bfa29ba04a1..7fdc72e4f62 100644
--- a/airflow-core/docs/index.rst
+++ b/airflow-core/docs/index.rst
@@ -46,7 +46,7 @@ Take a look at the following snippet of code:
 
     from datetime import datetime
 
-    from airflow import DAG
+    from airflow.sdk import DAG
     from airflow.decorators import task
     from airflow.providers.standard.operators.bash import BashOperator
 
