Copilot commented on code in PR #64740:
URL: https://github.com/apache/airflow/pull/64740#discussion_r3066481828


##########
airflow-core/docs/installation/upgrading_to_airflow3.rst:
##########
@@ -373,9 +373,51 @@ These include:
   - ``execution_date``
 - The ``catchup_by_default`` Dag parameter is now ``False`` by default.
 - The ``create_cron_data_intervals`` configuration is now ``False`` by 
default. This means that the ``CronTriggerTimetable`` will be used by default 
instead of the ``CronDataIntervalTimetable``
+- **Manual DAG runs and data intervals**: In Airflow 3, do not assume that a 
manually triggered Dag run's ``data_interval`` is derived from, or equal to, 
the supplied ``logical_date``. If your DAG logic needs the user-specified 
trigger date, use ``logical_date`` explicitly. This especially affects 
workflows that read ``data_interval_start`` or ``data_interval_end`` during 
manual triggering or when using ``TriggerDagRunOperator``. For detailed 
migration guidance, see :ref:`data-interval-manual-triggering`.

Review Comment:
   The new bullet mixes “DAG” and “Dag” in the same sentence ("Manual DAG 
runs…" but then "manually triggered Dag run's"). Please standardize 
capitalization/terminology within this bullet to avoid confusion (e.g., 
consistently use “DAG” throughout, or consistently “Dag” to match the rest of 
this document).
   ```suggestion
   - **Manual DAG runs and data intervals**: In Airflow 3, do not assume that a 
manually triggered DAG run's ``data_interval`` is derived from, or equal to, 
the supplied ``logical_date``. If your DAG logic needs the user-specified 
trigger date, use ``logical_date`` explicitly. This especially affects 
workflows that read ``data_interval_start`` or ``data_interval_end`` during 
manual triggering or when using ``TriggerDagRunOperator``. For detailed 
migration guidance, see :ref:`data-interval-manual-triggering`.
   ```



##########
airflow-core/docs/installation/upgrading_to_airflow3.rst:
##########
@@ -373,9 +373,51 @@ These include:
   - ``execution_date``
 - The ``catchup_by_default`` Dag parameter is now ``False`` by default.
 - The ``create_cron_data_intervals`` configuration is now ``False`` by 
default. This means that the ``CronTriggerTimetable`` will be used by default 
instead of the ``CronDataIntervalTimetable``
+- **Manual DAG runs and data intervals**: In Airflow 3, do not assume that a 
manually triggered Dag run's ``data_interval`` is derived from, or equal to, 
the supplied ``logical_date``. If your DAG logic needs the user-specified 
trigger date, use ``logical_date`` explicitly. This especially affects 
workflows that read ``data_interval_start`` or ``data_interval_end`` during 
manual triggering or when using ``TriggerDagRunOperator``. For detailed 
migration guidance, see :ref:`data-interval-manual-triggering`.
 - **Simple Auth** is now the default ``auth_manager``. To continue using FAB as 
the Auth Manager, please install the FAB provider and set ``auth_manager`` to 
``FabAuthManager``:
 
   .. code-block:: ini
 
       airflow.providers.fab.auth_manager.fab_auth_manager.FabAuthManager
 - **AUTH API** routes defined in the auth manager are prefixed with the 
``/auth`` route. URLs consumed outside of the application, such as OAuth 
redirect URLs, will have to be updated accordingly. For example, an OAuth 
redirect URL that was ``https://<your-airflow-url.com>/oauth-authorized/google`` 
in Airflow 2.x will be 
``https://<your-airflow-url.com>/auth/oauth-authorized/google`` in Airflow 3.x
+
+.. _data-interval-manual-triggering:
+
+Manual DAG Runs and ``logical_date``
+====================================
+
+For scheduled runs, ``logical_date`` and ``data_interval`` are both derived 
from
+the DAG's timetable.
+
+For manually triggered runs in Airflow 3, do not assume that
+``data_interval_start`` or ``data_interval_end`` are derived from, or equal to,
+the supplied ``logical_date``. The resulting ``data_interval`` depends on the
+timetable and the trigger path, and some APIs also allow the data interval to
+be provided explicitly.
+
+This matters most for DAGs that:
+
+- use ``data_interval_start`` or ``data_interval_end`` during manual runs
+- trigger downstream DAGs with ``TriggerDagRunOperator``
+- migrated from Airflow 2 and treated ``data_interval_start`` as the requested
+  manual run date
+
+Migration guidance
+------------------
+
+If your DAG logic needs the user-specified date for a manual run, use
+``logical_date`` explicitly.
+
+.. code-block:: python
+
+   @task
+   def process_data(context):

Review Comment:
   The example uses a TaskFlow `@task` callable that accepts a `context` 
argument, but elsewhere in the docs the recommended pattern is to call 
`get_current_context()` from inside the task (TaskFlow tasks don’t receive an 
execution context positional arg by default). Consider updating the snippet to 
use `get_current_context()` to retrieve `logical_date`, consistent with e.g. 
docs/tutorial/taskflow.rst and docs/core-concepts/variables.rst.
   ```suggestion
   from airflow.sdk import get_current_context, task
   
      @task
      def process_data():
          context = get_current_context()
   ```
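   The pattern the suggestion recommends — preferring ``logical_date`` over the
data interval for manual runs — can be sketched without an Airflow installation.
Below is a minimal, hypothetical stand-in where ``context`` is a plain dict
shaped like the mapping ``get_current_context()`` returns; the helper name
``requested_run_date`` and the sample values are illustrative, not part of the
Airflow API:

   ```python
   from datetime import datetime, timezone


   def requested_run_date(context: dict) -> datetime:
       """Return the user-requested date for a manually triggered run.

       Reads ``logical_date`` explicitly rather than ``data_interval_start``,
       since in Airflow 3 a manual run's data interval is not guaranteed to be
       derived from, or equal to, the supplied logical_date.
       """
       return context["logical_date"]


   # A manual trigger where the timetable-derived data interval does NOT
   # equal the supplied logical_date (values are made up for illustration):
   context = {
       "logical_date": datetime(2024, 5, 1, 12, tzinfo=timezone.utc),
       "data_interval_start": datetime(2024, 5, 1, tzinfo=timezone.utc),
   }

   print(requested_run_date(context).isoformat())  # 2024-05-01T12:00:00+00:00
   ```

   A DAG that instead read ``data_interval_start`` here would silently get
midnight rather than the noon timestamp the user supplied, which is exactly the
migration hazard the new docs section warns about.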



##########
airflow-core/docs/core-concepts/dag-run.rst:
##########
@@ -76,6 +76,24 @@ scheduled one interval after ``start_date``.
     For more information on ``logical date``, see :ref:`concepts-dag-run` and
     :ref:`faq:what-does-execution-date-mean`
 
+Manual Triggering and Data Intervals
+'''''''''''''''''''''''''''''''''''''
+
+When you manually trigger a DAG (for example from the UI, CLI, REST API, or
+``TriggerDagRunOperator``), do not assume the run's ``data_interval`` is
+derived from, or equal to, the supplied ``logical_date``.
+

Review Comment:
   This page mostly uses “Dag”/“Dag Run” terminology (e.g., “A Dag Run…” near 
the top), but the new section switches to “DAG” (“When you manually trigger a 
DAG…”). Please make the wording consistent within this page to avoid 
distracting terminology changes.


