amoghrajesh commented on code in PR #52297:
URL: https://github.com/apache/airflow/pull/52297#discussion_r2179507093


##########
airflow-core/docs/public-airflow-interface.rst:
##########
@@ -25,6 +36,13 @@ and extending Airflow capabilities by writing new executors, plugins, operators
 Public Interface can be useful for building custom tools and integrations with other systems,
 and for automating certain aspects of the Airflow workflow.
 
+In Airflow 3.0+, the primary public interface for DAG authors and task execution is the

Review Comment:
   ```suggestion
   The primary public interface for DAG authors and task execution is the
   ```
   
   The docs are versioned so no need for this.



##########
airflow-core/docs/public-airflow-interface.rst:
##########
@@ -25,6 +36,13 @@ and extending Airflow capabilities by writing new executors, plugins, operators
 Public Interface can be useful for building custom tools and integrations with other systems,
 and for automating certain aspects of the Airflow workflow.
 
+In Airflow 3.0+, the primary public interface for DAG authors and task execution is the
+:doc:`airflow.sdk namespace <core-concepts/taskflow>`. Direct access to the metadata database

Review Comment:
   ```suggestion
   using task SDK :doc:`airflow.sdk namespace <core-concepts/taskflow>`. Direct access to the metadata database
   ```
   
   Give or take on the syntax



##########
airflow-core/docs/public-airflow-interface.rst:
##########
@@ -25,6 +36,13 @@ and extending Airflow capabilities by writing new executors, plugins, operators
 Public Interface can be useful for building custom tools and integrations with other systems,
 and for automating certain aspects of the Airflow workflow.
 
+In Airflow 3.0+, the primary public interface for DAG authors and task execution is the
+:doc:`airflow.sdk namespace <core-concepts/taskflow>`. Direct access to the metadata database
+from task code is no longer allowed. Instead, use the :doc:`Stable REST API <stable-rest-api-ref>`,
+`Python Client <https://github.com/apache/airflow-client-python>`_, or Task Context methods.

Review Comment:
   ```suggestion
   `Python Client <https://github.com/apache/airflow-client-python>`_, or Task Context methods.
   ```
   `Task Context methods.` is not very clear to me.
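
   For what it's worth, here is a hedged sketch of one way to read "Task Context methods": task code pulls runtime state (``ti``, ``dag_run``, ``params``) from the Task Context instead of querying the metadata database. The helper name below is made up for illustration; only ``get_current_context`` and ``xcom_pull`` come from the PR text.

```python
# Hypothetical helper illustrating "Task Context methods": read runtime
# state through the context mapping rather than the metadata database.


def upstream_value(context, task_id: str, key: str = "return_value"):
    """Pull an upstream XCom via the TaskInstance held in the Task Context."""
    ti = context["ti"]
    return ti.xcom_pull(task_ids=task_id, key=key)


# Inside a running task it would be used roughly like this (assumes the
# airflow.sdk imports this PR documents; not verified here):
#
#     from airflow.sdk import get_current_context, task
#
#     @task
#     def consumer():
#         value = upstream_value(get_current_context(), "producer")
```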
   



##########
airflow-core/docs/public-airflow-interface.rst:
##########
@@ -56,13 +74,70 @@ way, the Stable REST API is recommended.
 Using the Public Interface for DAG Authors
 ==========================================
 
+The primary interface for DAG authors in Airflow 3.0+ is the :doc:`airflow.sdk namespace <core-concepts/taskflow>`.

Review Comment:
   ```suggestion
   The primary interface for DAG Authors is :doc:`airflow.sdk namespace <core-concepts/taskflow>`.
   ```



##########
airflow-core/docs/core-concepts/xcoms.rst:
##########
@@ -91,7 +111,10 @@ Custom XCom Backends
 
 The XCom system has interchangeable backends, and you can set which backend is being used via the ``xcom_backend`` configuration option.
 
-If you want to implement your own backend, you should subclass :class:`~airflow.models.xcom.BaseXCom`, and override the ``serialize_value`` and ``deserialize_value`` methods.
+If you want to implement your own backend, you should subclass :class:`~airflow.sdk.execution_time.xcom.XCom`, and override the ``serialize_value`` and ``deserialize_value`` methods.

Review Comment:
   No, we should subclass `~airflow.models.xcom.BaseXCom` itself; this is confusing.
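
   A hedged sketch of the backend shape being discussed, subclassing `BaseXCom` as suggested. The hook signatures are simplified and vary by Airflow version (the real ``deserialize_value`` receives an XCom row, not a raw string), so check ``airflow.models.xcom.BaseXCom`` before copying. The import guard is only so the sketch reads and runs standalone:

```python
import json

try:
    from airflow.models.xcom import BaseXCom
except ImportError:  # stand-in so the sketch works without Airflow installed
    class BaseXCom:
        pass


class JsonStringXComBackend(BaseXCom):
    """Toy custom backend: store XCom values as JSON strings (illustrative only)."""

    @staticmethod
    def serialize_value(value, **kwargs):
        # Real backends often push large values to object storage and store
        # only a reference here; this sketch just round-trips through JSON.
        return json.dumps(value)

    @staticmethod
    def deserialize_value(result):
        # Simplified: assumes `result` is the serialized string from above.
        return json.loads(result)
```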



##########
airflow-core/docs/core-concepts/xcoms.rst:
##########
@@ -73,7 +76,24 @@ An example of pushing multiple XComs and pulling them individually:
         # Pulling entire xcom data from push_multiple task
         data = context["ti"].xcom_pull(task_ids="push_multiple", key="return_value")
 
+You can also use the Task Context directly for XCom operations:
+
+.. code-block:: python
 
+    from airflow.sdk import get_current_context
+
+
+    @task
+    def example_task():
+        context = get_current_context()
+        ti = context["ti"]
+
+        # Push XCom
+        ti.xcom_push(key="my_key", value="my_value")
+
+        # Pull XCom
+        value = ti.xcom_pull(task_ids="previous_task", key="my_key")
+        return value

Review Comment:
   Is this needed? Everything below https://github.com/apache/airflow/pull/52297/files#diff-1db2e4edbed2b0e544d8fdde02d0d3628041ec0c4855fec83930ffcac1da60efR31 only discusses that



##########
airflow-core/docs/public-airflow-interface.rst:
##########
@@ -56,13 +74,70 @@ way, the Stable REST API is recommended.
 Using the Public Interface for DAG Authors
 ==========================================
 
+The primary interface for DAG authors in Airflow 3.0+ is the :doc:`airflow.sdk namespace <core-concepts/taskflow>`.
+This provides a stable, well-defined interface for creating DAGs and tasks that is not subject to internal
+implementation changes. The goal of this change is to decouple DAG authoring from Airflow internals (Scheduler,
+API Server, etc.), providing a forward-compatible, stable interface for writing and maintaining DAGs across Airflow versions.
+
+**Key Imports from airflow.sdk:**
+
+**Classes:**
+
+* ``Asset``
+* ``BaseNotifier``
+* ``BaseOperator``
+* ``BaseOperatorLink``
+* ``BaseSensorOperator``
+* ``Connection``
+* ``Context``
+* ``DAG``
+* ``EdgeModifier``
+* ``Label``
+* ``ObjectStoragePath``
+* ``Param``
+* ``TaskGroup``
+* ``Variable``

Review Comment:
   `BaseHook` merged in main too.
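
   As a hedged illustration of the surface listed above (names per the list, plus `BaseHook` on main; the operator itself is made up), a custom operator would build only on `airflow.sdk`. The import guard is just so the sketch stands alone:

```python
try:
    from airflow.sdk import BaseOperator  # public per the list above
except ImportError:  # minimal stand-in so the sketch runs without Airflow
    class BaseOperator:
        def __init__(self, *, task_id: str, **kwargs):
            self.task_id = task_id


class GreetOperator(BaseOperator):
    """Hypothetical operator: returns a greeting (becomes the return_value XCom)."""

    def __init__(self, *, name: str, **kwargs):
        super().__init__(**kwargs)
        self.name = name

    def execute(self, context):
        return f"Hello, {self.name}!"
```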



##########
airflow-core/docs/public-airflow-interface.rst:
##########
@@ -56,13 +74,70 @@ way, the Stable REST API is recommended.
 Using the Public Interface for DAG Authors
 ==========================================
 
+The primary interface for DAG authors in Airflow 3.0+ is the :doc:`airflow.sdk namespace <core-concepts/taskflow>`.
+This provides a stable, well-defined interface for creating DAGs and tasks that is not subject to internal
+implementation changes. The goal of this change is to decouple DAG authoring from Airflow internals (Scheduler,
+API Server, etc.), providing a forward-compatible, stable interface for writing and maintaining DAGs across Airflow versions.

Review Comment:
   ```suggestion
   API Server, etc.), providing a version-agnostic, stable interface for writing and maintaining DAGs across Airflow versions.
   ```
   
   Cannot commit to forward compatible as of now, I guess. CC @ashb / @kaxil 
   
   



##########
airflow-core/docs/public-airflow-interface.rst:
##########
@@ -77,64 +152,86 @@ You can read more about dags in :doc:`Dags <core-concepts/dags>`.
 
 References for the modules used in dags are here:
 
-.. toctree::
-  :includehidden:
-  :glob:
-  :maxdepth: 1
-
-  _api/airflow/models/dag/index
-  _api/airflow/models/dagbag/index
+.. note::
+   The airflow.sdk namespace provides the primary interface for DAG authors in Airflow 3.0+.
+   For detailed API documentation, see the `Task SDK Reference <https://airflow.apache.org/docs/task-sdk/stable/>`_.
 
-Properties of a :class:`~airflow.models.dagrun.DagRun` can also be referenced in things like :ref:`Templates <templates-ref>`.
-
-.. toctree::
-  :includehidden:
-  :glob:
-  :maxdepth: 1
+.. note::
+   The :class:`~airflow.models.dagbag.DagBag` class is used internally by Airflow for loading DAGs
+   from files and folders. DAG authors should use the :class:`~airflow.sdk.DAG` class from the
+   airflow.sdk namespace instead.
 
-  _api/airflow/models/dagrun/index
+.. note::
+   The :class:`~airflow.models.dagrun.DagRun` class is used internally by Airflow for DAG run
+   management. DAG authors should access DAG run information through the Task Context via
+   :func:`~airflow.sdk.get_current_context` or use the :class:`~airflow.sdk.types.DagRunProtocol`
+   interface.
 
 .. _pythonapi:operators:
 
 Operators
----------
+=========
 
-The base classes :class:`~airflow.models.baseoperator.BaseOperator` and :class:`~airflow.sensors.base.BaseSensorOperator` are public and may be extended to make new operators.
+The base classes :class:`~airflow.sdk.BaseOperator` and :class:`~airflow.sdk.BaseSensorOperator` are public and may be extended to make new operators.
+
+The recommended base class for new operators is :class:`~airflow.sdk.BaseOperator`
+from the airflow.sdk namespace.
 
 Subclasses of BaseOperator which are published in Apache Airflow are public in *behavior* but not in *structure*.  That is to say, the Operator's parameters and behavior is governed by semver but the methods are subject to change at any time.
 
 Task Instances
---------------
+==============
 
 Task instances are the individual runs of a single task in a DAG (in a DAG Run). They are available in the context
-passed to the execute method of the operators via the :class:`~airflow.models.taskinstance.TaskInstance` class.
+passed to the execute method of the operators via the :class:`~airflow.sdk.types.RuntimeTaskInstanceProtocol` class.
 
-.. toctree::
-  :includehidden:
-  :glob:
-  :maxdepth: 1
-
-  _api/airflow/models/taskinstance/index
+In Airflow 3.0+, task instances are accessed through the Task Context via :func:`~airflow.sdk.get_current_context`

Review Comment:
   Don't add versions in docs please. Remove all such occurrences



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
