KKcorps commented on a change in pull request #6295: [AIRFLOW-XXX] GSoD: Adding Task re-run documentation
URL: https://github.com/apache/airflow/pull/6295#discussion_r349855620
 
 

 ##########
 File path: docs/scheduler.rst
 ##########
 @@ -32,161 +30,31 @@ Airflow production environment. To kick it off, all you need to do is
 execute ``airflow scheduler``. It will use the configuration specified in
 ``airflow.cfg``.
 
-Note that if you run a DAG on a ``schedule_interval`` of one day,
-the run stamped ``2016-01-01`` will be triggered soon after ``2016-01-01T23:59``.
-In other words, the job instance is started once the period it covers
-has ended.
-
-**Let's Repeat That** The scheduler runs your job one ``schedule_interval`` AFTER the
-start date, at the END of the period.
-
-The scheduler starts an instance of the executor specified in the your
-``airflow.cfg``. If it happens to be the :class:`airflow.contrib.executors.local_executor.LocalExecutor`, tasks will be
-executed as subprocesses; in the case of :class:`airflow.executors.celery_executor.CeleryExecutor`, and :class:`airflow.executors.dask_executor.DaskExecutor` tasks are executed remotely.
+The scheduler starts an instance of an :doc:`Executor </executor/index>`. 
 
 To start a scheduler, simply run the command:
 
 .. code:: bash
 
     airflow scheduler
 
+You can start executing a DAG once your scheduler has started running successfully.
 
 Review comment:
   I'll change it to 'your dags will start executing' 
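
The scheduling semantics described in the removed lines above (a run stamped with the start of its period is triggered once that period ends) can be sketched in plain Python, without importing Airflow. The `trigger_time` helper below is hypothetical, introduced here only to illustrate the rule; it is not part of the Airflow API:

```python
from datetime import datetime, timedelta

def trigger_time(execution_date: datetime, schedule_interval: timedelta) -> datetime:
    # Hypothetical illustration: the run stamped ``execution_date`` covers the
    # period [execution_date, execution_date + schedule_interval), and is
    # triggered only once that period has ended.
    return execution_date + schedule_interval

# With a daily schedule_interval, the run stamped 2016-01-01 is
# triggered soon after 2016-01-01T23:59, i.e. at 2016-01-02T00:00.
print(trigger_time(datetime(2016, 1, 1), timedelta(days=1)))  # 2016-01-02 00:00:00
```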

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
