This is an automated email from the ASF dual-hosted git repository.

ephraimanierobi pushed a commit to branch v2-7-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit abfe7b89ef9d2f0b0ce46d146b2fd5d595dcfd08
Author: Vijayasarathi Balasubramanian <[email protected]>
AuthorDate: Fri Aug 4 15:14:43 2023 -0400

    Documentation Update to enhance Readability (#32832)
    
    * Documentation Update to enhance Readability
    
    Update to plugin.rst to enhance readability
    
    * Update docs/apache-airflow/authoring-and-scheduling/plugins.rst
    
    ---------
    
    Co-authored-by: eladkal <[email protected]>
    (cherry picked from commit aedefd6516101b182bc9ab3a6740ac5c0a8e2f9d)
---
 docs/apache-airflow/authoring-and-scheduling/plugins.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/apache-airflow/authoring-and-scheduling/plugins.rst b/docs/apache-airflow/authoring-and-scheduling/plugins.rst
index 2a9314f39e..f0c9f43ddc 100644
--- a/docs/apache-airflow/authoring-and-scheduling/plugins.rst
+++ b/docs/apache-airflow/authoring-and-scheduling/plugins.rst
@@ -82,9 +82,9 @@ start of each Airflow process, set ``[core] lazy_load_plugins = False`` in ``air
 This means that if you make any changes to plugins and you want the webserver or scheduler to use that new
 code you will need to restart those processes. However, it will not be reflected in new running tasks after the scheduler boots.
 
-By default, task execution will use forking to avoid the slow down of having to create a whole new python
-interpreter and re-parse all of the Airflow code and start up routines -- this is a big benefit for shorter
-running tasks. This does mean that if you use plugins in your tasks, and want them to update you will either
+By default, task execution uses forking. This avoids the slowdown associated with creating a new Python interpreter
+and re-parsing all of Airflow's code and startup routines. This approach offers significant benefits, especially for shorter tasks.
+This does mean that if you use plugins in your tasks, and want them to update you will either
 need to restart the worker (if using CeleryExecutor) or scheduler (Local or Sequential executors). The other
 option is you can accept the speed hit at start up set the ``core.execute_tasks_new_python_interpreter``
 config setting to True, resulting in launching a whole new python interpreter for tasks.
