This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-2-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit e2bb59823971773665bf10a2d8fb9b8a877330e1
Author: Josh Fell <[email protected]>
AuthorDate: Wed Dec 8 12:14:53 2021 -0500

    Fix grammar and typos in "Logging for Tasks" guide (#20146)
    
    (cherry picked from commit 70818319a038f1d17c179c278930b5b85035085d)
---
 .../logging-monitoring/logging-tasks.rst           | 36 +++++++++++-----------
 1 file changed, 18 insertions(+), 18 deletions(-)

diff --git a/docs/apache-airflow/logging-monitoring/logging-tasks.rst b/docs/apache-airflow/logging-monitoring/logging-tasks.rst
index 043f8f7..13cb248 100644
--- a/docs/apache-airflow/logging-monitoring/logging-tasks.rst
+++ b/docs/apache-airflow/logging-monitoring/logging-tasks.rst
@@ -20,18 +20,18 @@
 Logging for Tasks
 =================
 
-Airflow writes logs for tasks in a way that allows to see the logs for each 
task separately via Airflow UI.
-The Core Airflow implements writing and serving logs locally. However you can 
also write logs to remote
-services - via community providers, but you can also write your own loggers.
+Airflow writes logs for tasks in a way that allows you to see the logs for 
each task separately in the Airflow UI.
+Core Airflow implements writing and serving logs locally. However, you can 
also write logs to remote
+services via community providers, or write your own loggers.
 
-Below we describe the local task logging, but Apache Airflow Community also 
releases providers for many
-services (:doc:`apache-airflow-providers:index`) and some of them also provide 
handlers that extend logging
-capability of Apache Airflow. You can see all those providers in 
:doc:`apache-airflow-providers:core-extensions/logging`.
+Below we describe the local task logging. The Apache Airflow Community also 
releases providers for many
+services (:doc:`apache-airflow-providers:index`) and some of them provide 
handlers that extend the logging
+capability of Apache Airflow. You can see all of these providers in 
:doc:`apache-airflow-providers:core-extensions/logging`.
 
 Writing logs locally
 --------------------
 
-Users can specify the directory to place log files in ``airflow.cfg`` using
+You can specify the directory to place log files in ``airflow.cfg`` using
 ``base_log_folder``. By default, logs are placed in the ``AIRFLOW_HOME``
 directory.
 
@@ -40,18 +40,18 @@ directory.
 
 The following convention is followed while naming logs: 
``{dag_id}/{task_id}/{logical_date}/{try_number}.log``
 
-In addition, users can supply a remote location to store current logs and 
backups.
+In addition, you can supply a remote location to store current logs and 
backups.
 
-In the Airflow Web UI, remote logs take precedence over local logs when remote 
logging is enabled. If remote logs
+In the Airflow UI, remote logs take precedence over local logs when remote 
logging is enabled. If remote logs
 can not be found or accessed, local logs will be displayed. Note that logs
-are only sent to remote storage once a task is complete (including failure); 
In other words, remote logs for
+are only sent to remote storage once a task is complete (including failure). 
In other words, remote logs for
 running tasks are unavailable (but local logs are available).
 
 
 Troubleshooting
 ---------------
 
-If you want to check which task handler is currently set, you can use 
``airflow info`` command as in
+If you want to check which task handler is currently set, you can use the 
``airflow info`` command as in
 the example below.
 
 .. code-block:: bash
@@ -67,7 +67,7 @@ the example below.
     Plugins Folder: [/root/airflow/plugins]
     Base Log Folder: [/root/airflow/logs]
 
-You can also use ``airflow config list`` to check that the logging 
configuration options have valid values.
+You can also run ``airflow config list`` to check that the logging 
configuration options have valid values.
 
 .. _write-logs-advanced:
 
@@ -75,9 +75,9 @@ Advanced configuration
 ----------------------
 
 Not all configuration options are available from the ``airflow.cfg`` file. 
Some configuration options require
-that the logging config class be overwritten. This can be done by 
``logging_config_class`` option
-in ``airflow.cfg`` file. This option should specify the import path indicating 
to a configuration compatible with
-:func:`logging.config.dictConfig`. If your file is a standard import location, 
then you should set a :envvar:`PYTHONPATH` environment.
+that the logging config class be overridden. This can be done via the 
``logging_config_class`` option
+in the ``airflow.cfg`` file. This option should specify the import path to a 
configuration compatible with
+:func:`logging.config.dictConfig`. If your file is a standard import location, 
then you should set a :envvar:`PYTHONPATH` environment variable.
 
 Follow the steps below to enable a custom logging config class:
 
@@ -88,7 +88,7 @@ Follow the steps below to enable custom logging config class:
        export PYTHONPATH=~/airflow/
 
 #. Create a directory to store the config file e.g. ``~/airflow/config``
-#. Create file called ``~/airflow/config/log_config.py`` with following 
content:
+#. Create a file called ``~/airflow/config/log_config.py`` with the following 
contents:
 
     .. code-block:: python
 
@@ -113,14 +113,14 @@ See :doc:`../modules_management` for details on how 
Python and Airflow manage mo
 External Links
 --------------
 
-When using remote logging, users can configure Airflow to show a link to an 
external UI within the Airflow Web UI. Clicking the link redirects a user to 
the external UI.
+When using remote logging, you can configure Airflow to show a link to an 
external UI within the Airflow Web UI. Clicking the link redirects you to the 
external UI.
 
 Some external systems require specific configuration in Airflow for 
redirection to work, but others do not.
 
 Serving logs from workers
 -------------------------
 
-Most task handlers send logs upon completion of a task. In order to view logs 
in real time, airflow automatically starts an http server to serve the logs in 
the following cases:
+Most task handlers send logs upon completion of a task. In order to view logs 
in real time, Airflow automatically starts an HTTP server to serve the logs in 
the following cases:
 
 - If ``SequentialExecutor`` or ``LocalExecutor`` is used, then when ``airflow 
scheduler`` is running.
 - If ``CeleryExecutor`` is used, then when ``airflow worker`` is running.
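
The advanced-configuration hunk above boils down to pointing 
``logging_config_class`` at a dict that :func:`logging.config.dictConfig` 
accepts. As a rough sketch of such a ``log_config.py`` (the formatter and 
handler names and the settings on the ``airflow.task`` logger are illustrative 
assumptions, not Airflow's shipped defaults):

```python
# log_config.py -- a minimal, dictConfig-compatible logging config.
# Names and settings below are illustrative assumptions; Airflow's real
# defaults live in airflow.config_templates.airflow_local_settings.
import logging
import logging.config

LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "airflow": {
            "format": "[%(asctime)s] {%(filename)s:%(lineno)d} "
                      "%(levelname)s - %(message)s",
        },
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "airflow",
        },
    },
    "loggers": {
        # Configure the task logger; handler choice here is an assumption.
        "airflow.task": {
            "handlers": ["console"],
            "level": "INFO",
            "propagate": False,
        },
    },
}

if __name__ == "__main__":
    # Sanity check: the standard library accepts this structure.
    logging.config.dictConfig(LOGGING_CONFIG)
    logging.getLogger("airflow.task").info("custom logging config loaded")
```

With ``~/airflow/config`` on :envvar:`PYTHONPATH`, ``airflow.cfg`` would then 
reference this dict via the ``logging_config_class`` option mentioned in the 
diff (e.g. ``logging_config_class = log_config.LOGGING_CONFIG``).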
