jedcunningham commented on code in PR #25410:
URL: https://github.com/apache/airflow/pull/25410#discussion_r940700643


##########
docs/apache-airflow/dag-run.rst:
##########
@@ -82,7 +81,7 @@ Data Interval
 
 Each DAG run in Airflow has an assigned "data interval" that represents the time
 range it operates in. For a DAG scheduled with ``@daily``, for example, each of
-its data interval would start each day at midnight (00:00) and end at midnight
+ts data interval would start each day at midnight (00:00) and end at midnight

Review Comment:
   ```suggestion
   its data interval would start each day at midnight (00:00) and end at midnight
   ```



##########
docs/apache-airflow/faq.rst:
##########
@@ -41,13 +41,13 @@ There are very many reasons why your task might not be getting scheduled. Here a
   "airflow" and "DAG" in order to prevent the DagBag parsing from importing all python
   files collocated with user's DAGs.
 
-- Is your ``start_date`` set properly? The Airflow scheduler triggers the
-  task soon after the ``start_date + schedule_interval`` is passed.
+- Is your ``start_date`` set properly? The Airflow scheduler won't trigger the task until the

Review Comment:
   ```suggestion
   - Is your ``start_date`` set properly? The Airflow scheduler won't trigger the task until
   ```



##########
scripts/in_container/verify_providers.py:
##########
@@ -200,6 +200,7 @@ class ProviderPackageDetails(NamedTuple):
     "distutils Version classes are deprecated. Use packaging.version instead.",
     "the imp module is deprecated in favour of importlib; "
     "see the module's documentation for alternative uses",
+    "see the module's documentation for alternative uses",

Review Comment:
   ```suggestion
   ```



##########
newsfragments/25410.significant.rst:
##########
@@ -0,0 +1,3 @@
+Deprecation of ``schedule_interval`` and ``timetable`` params
+
+We add new DAG parameter ``schedule`` that can accept a cron expression, timedelta object, *timetable* object, or list of dataset objects. Params ``schedule_interval`` and ``timetable`` are deprecated.

Review Comment:
   Might be worth adding a code example, "this becomes this" type thing.
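   One possible shape for such a "this becomes this" example (the DAG id and dates here are made up, and this assumes an Airflow version where the unified ``schedule`` parameter exists, per the fragment above):

   ```python
   from datetime import datetime, timedelta

   from airflow import DAG

   # Before (deprecated ``schedule_interval`` parameter):
   #
   #     dag = DAG(
   #         dag_id="my_dag",
   #         start_date=datetime(2022, 1, 1),
   #         schedule_interval=timedelta(days=1),
   #     )

   # After: the single ``schedule`` parameter accepts the same value.
   dag = DAG(
       dag_id="my_dag",
       start_date=datetime(2022, 1, 1),
       schedule=timedelta(days=1),
   )
   ```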



##########
docs/apache-airflow/faq.rst:
##########
@@ -41,13 +41,13 @@ There are very many reasons why your task might not be getting scheduled. Here a
   "airflow" and "DAG" in order to prevent the DagBag parsing from importing all python
   files collocated with user's DAGs.
 
-- Is your ``start_date`` set properly? The Airflow scheduler triggers the
-  task soon after the ``start_date + schedule_interval`` is passed.
+- Is your ``start_date`` set properly? The Airflow scheduler won't trigger the task until the
+  after the first schedule interval following the start date has passed.

Review Comment:
   Might be worth adding a simple example here, since the "start_date+schedule_interval" doesn't work as cleanly any longer.
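   A minimal sketch of the kind of example that could be added (the dates are made up for illustration): with a ``start_date`` of 2022-01-01 and a daily schedule, the first run covers the interval 2022-01-01 to 2022-01-02 and is only triggered once that interval has ended.

   ```python
   from datetime import datetime, timedelta

   # Hypothetical values for illustration.
   start_date = datetime(2022, 1, 1)
   interval = timedelta(days=1)  # i.e. a @daily schedule

   # The first DAG run covers [start_date, start_date + interval);
   # the scheduler triggers it shortly after that interval ends.
   first_run_after = start_date + interval
   print(first_run_after)  # 2022-01-02 00:00:00
   ```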



##########
docs/apache-airflow/dag-run.rst:
##########
@@ -117,8 +116,8 @@ DAG run fails.
 Catchup
 -------
 
-An Airflow DAG with a ``start_date``, possibly an ``end_date``, and a ``schedule_interval`` defines a
-series of intervals which the scheduler turns into individual DAG Runs and executes. The scheduler, by default, will
+An Airflow DAG defined with a ``start_date``, possibly an ``end_date``, and a either a cron expression or timetable, defines a series of intervals which the scheduler turns into individual DAG Runs and executes.

Review Comment:
   ```suggestion
   An Airflow DAG defined with a ``start_date`` (and possibly an ``end_date``), and either a cron expression or timetable, defines a series of intervals which the scheduler turns into individual DAG Runs and executes.
   ```
   
   I think the sentence still needs a little love.
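   The interval series described here could also be illustrated with a short sketch (the dates and the loop below are illustrative only, not Airflow's actual scheduling code):

   ```python
   from datetime import datetime, timedelta

   # Hypothetical bounds for illustration.
   start_date = datetime(2022, 1, 1)
   end_date = datetime(2022, 1, 4)
   interval = timedelta(days=1)  # stand-in for a daily cron expression

   # Enumerate the data intervals between start_date and end_date;
   # with catchup enabled, the scheduler creates one DAG run per interval.
   intervals = []
   cursor = start_date
   while cursor + interval <= end_date:
       intervals.append((cursor, cursor + interval))
       cursor += interval

   for begin, end in intervals:
       print(begin.date(), "->", end.date())
   ```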



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
