This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch v2-1-test
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit 67f04d205ea8ae5e91f521548ce7d84ace2b2e6c
Author: Collin McNulty <[email protected]>
AuthorDate: Thu Jul 29 10:48:27 2021 -0500

    Grammar and clarity pass on documentation (#17318)
    
    Minor grammar edits, fixes to broken links, and rewording for clarification.
    
    There are a few changes that others may disagree with me about:
    - Changed "outwith" to "instead of"
    - All non-code references I found to "time-zone" or "timezone" changed to 
"time zone"
    - It seems like top-level pages are supposed to capitalize words other than articles and prepositions, but two pages were not following this convention. I have changed them to conform to the others.
    - I found a sentence in the health checks section extremely confusing. I made my best attempt to restate it clearly, but I'm not sure I understood it well enough to restate it correctly.
    
    (cherry picked from commit 7b10b56a67ff935f28197e1916972b29398f667e)
---
 docs/apache-airflow/index.rst                        |  2 +-
 .../logging-monitoring/check-health.rst              |  8 ++++++--
 docs/apache-airflow/logging-monitoring/errors.rst    |  4 ++--
 docs/apache-airflow/logging-monitoring/metrics.rst   |  6 +++---
 docs/apache-airflow/start/index.rst                  |  2 +-
 docs/apache-airflow/timezone.rst                     | 20 ++++++++++----------
 6 files changed, 23 insertions(+), 19 deletions(-)

diff --git a/docs/apache-airflow/index.rst b/docs/apache-airflow/index.rst
index ba9db5d..18b2f7a 100644
--- a/docs/apache-airflow/index.rst
+++ b/docs/apache-airflow/index.rst
@@ -97,7 +97,7 @@ unit of work and continuity.
     lineage
     dag-serialization
     modules_management
-    Release policies <release-process>
+    Release Policies <release-process>
     changelog
     best-practices
     production-deployment
diff --git a/docs/apache-airflow/logging-monitoring/check-health.rst 
b/docs/apache-airflow/logging-monitoring/check-health.rst
index a5f8664..4468deb 100644
--- a/docs/apache-airflow/logging-monitoring/check-health.rst
+++ b/docs/apache-airflow/logging-monitoring/check-health.rst
@@ -20,9 +20,13 @@
 Checking Airflow Health Status
 ==============================
 
-Airflow has two methods to check the health of components - HTTP checks and 
CLI checks. Their choice depends on the role of the component as well as what 
tools it uses to monitor the deployment.
+Airflow has two methods to check the health of components - HTTP checks and 
CLI checks. All available checks are
+accessible through the CLI, but only some are accessible through HTTP due to 
the role of the component being checked
+and the tools being used to monitor the deployment.
 
-For example, when running on Kubernetes, use `a Liveness probes 
<https://kubernetes.io/docs/tasks/configure-pod-container/configure-liveness-readiness-startup-probes/>`__
 (``livenessProbe`` property) with :ref:`CLI checks 
<check-health/cli-checks-for-scheduler>` on the scheduler deployment to restart 
it when it fail. For the webserver, you can configure the readiness probe 
(``readinessProbe`` property) using :ref:`check-health/http-endpoint`.
+For example, when running on Kubernetes, use a `liveness probe <https://kubernetes.io/docs/tasks/configure-pod-container/configure-liveness-readiness-startup-probes/>`__ (``livenessProbe`` property)
+with :ref:`CLI checks <check-health/cli-checks-for-scheduler>` on the 
scheduler deployment to restart it when it fails.
+For the webserver, you can configure the readiness probe (``readinessProbe`` 
property) using :ref:`check-health/http-endpoint`.
 
 For an example of a Docker Compose environment, see the ``docker-compose.yaml`` file available in :doc:`/start/docker`.
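Beyond the patch text itself, the HTTP health check referenced above can be consumed from a short probe script. This is a hedged sketch: the ``/health`` path and the per-component ``{"status": "healthy"}`` JSON shape follow the Airflow docs, while the base URL and the helper names are illustrative assumptions.

```python
# Hedged sketch: consuming the webserver health check described above.
# The /health path and per-component {"status": "healthy"} payload shape
# follow the Airflow docs; base URL and function names are assumptions.
import json
from urllib.request import urlopen


def components_healthy(payload: dict) -> bool:
    """Return True only if every reported component says it is healthy."""
    return all(c.get("status") == "healthy" for c in payload.values())


def check_airflow(base_url: str = "http://localhost:8080") -> bool:
    """Poll the webserver /health endpoint and summarize it as a boolean."""
    with urlopen(f"{base_url}/health") as resp:
        return components_healthy(json.load(resp))
```

A readiness probe could wrap such a check in the same way the ``readinessProbe`` property wraps the HTTP endpoint.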
 
diff --git a/docs/apache-airflow/logging-monitoring/errors.rst 
b/docs/apache-airflow/logging-monitoring/errors.rst
index 37ed307..578666b 100644
--- a/docs/apache-airflow/logging-monitoring/errors.rst
+++ b/docs/apache-airflow/logging-monitoring/errors.rst
@@ -41,7 +41,7 @@ Add your ``SENTRY_DSN`` to your configuration file e.g. 
``airflow.cfg`` in ``[se
 .. note::
     If this value is not provided, the SDK will try to read it from the 
``SENTRY_DSN`` environment variable.
 
-You can supply `additional configuration options 
<https://docs.sentry.io/error-reporting/configuration/?platform=python>`__ 
based on the Python platform via ``[sentry]`` section.
+You can supply `additional configuration options <https://docs.sentry.io/platforms/python/configuration/options>`__ for the Python platform via the ``[sentry]`` section.
 Unsupported options: ``integrations``, ``in_app_include``, ``in_app_exclude``, 
``ignore_errors``, ``before_breadcrumb``, ``before_send``, ``transport``.
 
 Tags
@@ -60,7 +60,7 @@ Breadcrumbs
 ------------
 
 
-When a task fails with an error `breadcrumbs 
<https://docs.sentry.io/enriching-error-data/breadcrumbs/?platform=python>`__ 
will be added for the other tasks in the current dag run.
+When a task fails with an error `breadcrumbs 
<https://docs.sentry.io/platforms/python/enriching-events/breadcrumbs/>`__ will 
be added for the other tasks in the current dag run.
 
 ======================================= ==============================================================
 Name                                    Description
diff --git a/docs/apache-airflow/logging-monitoring/metrics.rst 
b/docs/apache-airflow/logging-monitoring/metrics.rst
index d410261..b55ad92 100644
--- a/docs/apache-airflow/logging-monitoring/metrics.rst
+++ b/docs/apache-airflow/logging-monitoring/metrics.rst
@@ -50,15 +50,15 @@ the metrics that start with the elements of the list:
     statsd_allow_list = scheduler,executor,dagrun
 
 If you want to redirect metrics to a different name, you can configure the ``stat_name_handler`` option
-in ``[scheduler]`` section.  It should point to a function that validate the 
statsd stat name, apply changes
-to the stat name if necessary and return the transformed stat name. The 
function may looks as follow:
+in the ``[scheduler]`` section. It should point to a function that validates the statsd stat name, applies changes
+to the stat name if necessary, and returns the transformed stat name. The function may look as follows:
 
 .. code-block:: python
 
     def my_custom_stat_name_handler(stat_name: str) -> str:
         return stat_name.lower()[:32]
 
-If you want to use a custom Statsd client outwith the default one provided by 
Airflow the following key must be added
+If you want to use a custom Statsd client instead of the default one provided 
by Airflow, the following key must be added
 to the configuration file alongside the module path of your custom Statsd 
client. This module must be available on
 your :envvar:`PYTHONPATH`.
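As an illustration of the custom Statsd client mentioned above, a replacement client only needs to expose the counter/gauge/timer methods of the common statsd interface. The class name, the in-memory recording behavior, and the method signatures below are assumptions based on the widely used ``statsd`` client interface, not Airflow's documented contract.

```python
# Minimal sketch of a custom Statsd-style client that records metrics in
# memory instead of sending them over UDP. Method names/signatures mirror
# the common statsd interface (incr/decr/gauge/timing) as an assumption.
from collections import defaultdict


class LoggingStatsClient:
    """Collects metrics in dictionaries for inspection or logging."""

    def __init__(self):
        self.counters = defaultdict(int)
        self.gauges = {}
        self.timings = defaultdict(list)

    def incr(self, stat, count=1, rate=1):
        self.counters[stat] += count

    def decr(self, stat, count=1, rate=1):
        self.counters[stat] -= count

    def gauge(self, stat, value, rate=1, delta=False):
        if delta:
            self.gauges[stat] = self.gauges.get(stat, 0) + value
        else:
            self.gauges[stat] = value

    def timing(self, stat, dt):
        self.timings[stat].append(dt)


client = LoggingStatsClient()
client.incr("scheduler.heartbeat")
client.gauge("dagrun.open_slots", 4)
```

Pointing the configuration at such a module would route every metric through these methods instead of the default client.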
 
diff --git a/docs/apache-airflow/start/index.rst 
b/docs/apache-airflow/start/index.rst
index c86ef83..b8f0c0b 100644
--- a/docs/apache-airflow/start/index.rst
+++ b/docs/apache-airflow/start/index.rst
@@ -15,7 +15,7 @@
     specific language governing permissions and limitations
     under the License.
 
-Quick start
+Quick Start
 ===========
 
 This section contains quick start guides to help you get up and running with 
Apache Airflow.
diff --git a/docs/apache-airflow/timezone.rst b/docs/apache-airflow/timezone.rst
index 63d1cec..d543c60 100644
--- a/docs/apache-airflow/timezone.rst
+++ b/docs/apache-airflow/timezone.rst
@@ -17,22 +17,22 @@
 
 
 
-Time zones
+Time Zones
 ==========
 
 Support for time zones is enabled by default. Airflow stores datetime 
information in UTC internally and in the database.
-It allows you to run your DAGs with time zone dependent schedules. At the 
moment Airflow does not convert them to the
-end user’s time zone in the user interface. There it will always be displayed 
in UTC. Also templates used in Operators
-are not converted. Time zone information is exposed and it is up to the writer 
of DAG what do with it.
+It allows you to run your DAGs with time zone dependent schedules. At the 
moment, Airflow does not convert them to the
+end user’s time zone in the user interface. It will always be displayed in UTC 
there. Also, templates used in Operators
+are not converted. Time zone information is exposed, and it is up to the DAG writer to decide what to do with it.
 
 This is handy if your users live in more than one time zone and you want to 
display datetime information according to
 each user’s wall clock.
 
-Even if you are running Airflow in only one time zone it is still good 
practice to store data in UTC in your database
-(also before Airflow became time zone aware this was also to recommended or 
even required setup). The main reason is
-Daylight Saving Time (DST). Many countries have a system of DST, where clocks 
are moved forward in spring and backward
+Even if you are running Airflow in only one time zone, it is still good 
practice to store data in UTC in your database
+(before Airflow became time zone aware, this was also the recommended or even required setup). The main reason is
+that many countries use Daylight Saving Time (DST), where clocks are moved 
forward in spring and backward
 in autumn. If you’re working in local time, you’re likely to encounter errors 
twice a year, when the transitions
-happen. (The pendulum and pytz documentation discusses these issues in greater 
detail.) This probably doesn’t matter
+happen. (The pendulum and pytz documentation discuss these issues in greater 
detail.) This probably doesn’t matter
 for a simple DAG, but it’s a problem if you are in, for example, financial 
services where you have end of day
 deadlines to meet.
 
@@ -68,7 +68,7 @@ a datetime object is aware. Otherwise, it’s naive.
 
 You can use ``timezone.is_localized()`` and ``timezone.is_naive()`` to 
determine whether datetimes are aware or naive.
 
-Because Airflow uses time-zone-aware datetime objects. If your code creates 
datetime objects they need to be aware too.
+Because Airflow uses time zone aware datetime objects, if your code creates datetime objects they need to be aware too.
 
 .. code-block:: python
 
@@ -103,7 +103,7 @@ Unfortunately, during DST transitions, some datetimes don’t 
exist or are ambig
 In such situations, pendulum raises an exception. That’s why you should always 
create aware
 datetime objects when time zone support is enabled.
 
-In practice, this is rarely an issue. Airflow gives you aware datetime objects 
in the models and DAGs, and most often,
+In practice, this is rarely an issue. Airflow gives you time zone aware 
datetime objects in the models and DAGs, and most often,
 new datetime objects are created from existing ones through timedelta 
arithmetic. The only datetime that’s often
 created in application code is the current time, and ``timezone.utcnow()`` 
automatically does the right thing.
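To make the aware-versus-naive distinction in the patched text concrete, here is a short sketch using only the standard library. The ``is_naive`` helper follows the stdlib convention (no ``tzinfo``, or a ``tzinfo`` that yields no UTC offset); it is an assumption that Airflow's ``timezone.is_naive()`` follows the same convention, and the helper itself is illustrative, not Airflow's implementation.

```python
# Aware vs. naive datetimes, using only the standard library. The helper
# follows the stdlib convention for "naive"; Airflow's timezone.is_naive()
# is assumed (not confirmed here) to behave the same way.
from datetime import datetime, timezone


def is_naive(dt: datetime) -> bool:
    return dt.tzinfo is None or dt.tzinfo.utcoffset(dt) is None


naive = datetime(2021, 7, 29, 10, 48)                       # no tzinfo
aware = datetime(2021, 7, 29, 10, 48, tzinfo=timezone.utc)  # UTC-aware

assert is_naive(naive)
assert not is_naive(aware)

# An aware "now" in UTC, analogous to what timezone.utcnow() provides:
now = datetime.now(timezone.utc)
assert not is_naive(now)
```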
 
