This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


The following commit(s) were added to refs/heads/main by this push:
     new e756d99909 Fixing some typos in various texts (#1253)
e756d99909 is described below

commit e756d999092240efdf37755698e05f32f2a74e9c
Author: Didier Durand <[email protected]>
AuthorDate: Fri Oct 24 11:26:07 2025 +0200

    Fixing some typos in various texts (#1253)
---
 landing-pages/site/content/en/announcements/_index.md          |  6 +++---
 landing-pages/site/content/en/blog/airflow-1.10.10/index.md    |  8 ++++----
 .../site/content/en/blog/airflow-1.10.8-1.10.9/index.md        |  4 ++--
 landing-pages/site/content/en/blog/airflow-2.2.0/index.md      |  2 +-
 landing-pages/site/content/en/blog/airflow-3.1.0/index.md      |  2 +-
 .../site/content/en/blog/airflow-survey-2022/index.md          |  4 ++--
 .../site/content/en/blog/apache-airflow-for-newcomers/index.md |  2 +-
 .../blog/documenting-using-local-development-environments.md   |  2 +-
 ...rience-in-google-season-of-docs-2019-with-apache-airflow.md | 10 +++++-----
 landing-pages/site/content/en/ecosystem/_index.md              |  2 +-
 10 files changed, 21 insertions(+), 21 deletions(-)

diff --git a/landing-pages/site/content/en/announcements/_index.md 
b/landing-pages/site/content/en/announcements/_index.md
index 2e480518d2..92dba04ae9 100644
--- a/landing-pages/site/content/en/announcements/_index.md
+++ b/landing-pages/site/content/en/announcements/_index.md
@@ -1338,7 +1338,7 @@ Pypi - https://pypi.python.org/pypi/apache-airflow (Run 
`pip install apache-airf
 
 Changelog - https://airflow.apache.org/changelog.html#airflow-1-10-2-2019-01-19
 
-By default one of Airflow's dependencies installs a GPL dependency 
(unidecode). To avoid this dependency set **SLUGIFY_USES_TEXT_UNIDECODE=yes** 
in your environment when you install or upgrade Airflow. To force installing 
the GPL version set **AIRFLOW_GPL_UNIDECODE**. One of these two environment 
variables must be specified.
+By default, one of Airflow's dependencies installs a GPL dependency 
(unidecode). To avoid this dependency set **SLUGIFY_USES_TEXT_UNIDECODE=yes** 
in your environment when you install or upgrade Airflow. To force installing 
the GPL version set **AIRFLOW_GPL_UNIDECODE**. One of these two environment 
variables must be specified.
 
 # Jan 9, 2019
 
@@ -1370,7 +1370,7 @@ Pypi - https://pypi.python.org/pypi/apache-airflow (Run 
`pip install apache-airf
 
 Changelog - 
https://github.com/apache/incubator-airflow/blob/v1-10-test/CHANGELOG.txt
 
-By default one of Airflow's dependencies installs a GPL dependency 
(unidecode). To avoid this dependency set **SLUGIFY_USES_TEXT_UNIDECODE=yes** 
in your environment when you install or upgrade Airflow. To force installing 
the GPL version set **AIRFLOW_GPL_UNIDECODE**. One of these two environment 
variables must be specified.
+By default, one of Airflow's dependencies installs a GPL dependency 
(unidecode). To avoid this dependency set **SLUGIFY_USES_TEXT_UNIDECODE=yes** 
in your environment when you install or upgrade Airflow. To force installing 
the GPL version set **AIRFLOW_GPL_UNIDECODE**. One of these two environment 
variables must be specified.
 
 
 # Oct 16, 2018
@@ -1417,7 +1417,7 @@ Pypi - https://pypi.python.org/pypi/apache-airflow (Run 
`pip install apache-airf
 
 Changelog - 
https://github.com/apache/incubator-airflow/blob/8100f1f/CHANGELOG.txt
 
-By default one of Airflow's dependencies installs a GPL dependency 
(unidecode). To avoid this dependency set **SLUGIFY_USES_TEXT_UNIDECODE=yes** 
in your environment when you install or upgrade Airflow. To force installing 
the GPL version set **AIRFLOW_GPL_UNIDECODE**. One of these two environment 
variables must be specified.
+By default, one of Airflow's dependencies installs a GPL dependency 
(unidecode). To avoid this dependency set **SLUGIFY_USES_TEXT_UNIDECODE=yes** 
in your environment when you install or upgrade Airflow. To force installing 
the GPL version set **AIRFLOW_GPL_UNIDECODE**. One of these two environment 
variables must be specified.
 
 
 # Aug 3, 2018
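
The announcement text above names the two environment variables but not how they are used in practice. A minimal sketch of a scripted install that opts out of the GPL dependency — only the variable names come from the announcement; the script itself is illustrative:

```python
# Illustrative: set SLUGIFY_USES_TEXT_UNIDECODE=yes so pip resolves the
# non-GPL unidecode path, then install/upgrade Airflow in a subprocess.
import os
import subprocess
import sys

env = os.environ.copy()
env["SLUGIFY_USES_TEXT_UNIDECODE"] = "yes"  # or set AIRFLOW_GPL_UNIDECODE to force the GPL variant

subprocess.check_call(
    [sys.executable, "-m", "pip", "install", "--upgrade", "apache-airflow"],
    env=env,
)
```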
diff --git a/landing-pages/site/content/en/blog/airflow-1.10.10/index.md 
b/landing-pages/site/content/en/blog/airflow-1.10.10/index.md
index b693f70ed6..4510a5f7da 100644
--- a/landing-pages/site/content/en/blog/airflow-1.10.10/index.md
+++ b/landing-pages/site/content/en/blog/airflow-1.10.10/index.md
@@ -20,14 +20,14 @@ Airflow 1.10.10 contains 199 commits since 1.10.9 and 
includes 11 new features,
 
 Some of the noteworthy new features (user-facing) are:
 
-- [Allow user to chose timezone to use in the RBAC 
UI](https://github.com/apache/airflow/pull/8046)
+- [Allow user to choose timezone to use in the RBAC 
UI](https://github.com/apache/airflow/pull/8046)
 - [Add Production Docker image 
support](https://github.com/apache/airflow/pull/7832)
 - [Allow Retrieving Airflow Connections & Variables from various Secrets 
backend](http://airflow.apache.org/docs/1.10.10/howto/use-alternative-secrets-backend.html)
 - [Stateless Webserver using DAG 
Serialization](http://airflow.apache.org/docs/1.10.10/dag-serialization.html)
 - [Tasks with Dummy Operators are no longer sent to 
executor](https://github.com/apache/airflow/pull/7880)
 - [Allow passing DagRun conf when triggering dags via 
UI](https://github.com/apache/airflow/pull/7312)
 
-### Allow user to chose timezone to use in the RBAC UI
+### Allow user to choose timezone to use in the RBAC UI
 
 By default the Web UI will show times in UTC. It is possible to change the 
timezone shown by using the menu in the top
  right (click on the clock to activate it):
@@ -41,7 +41,7 @@ Details: 
https://airflow.apache.org/docs/1.10.10/timezone.html#web-ui
 
 ### Add Production Docker image support
 
-There are brand new production images (alpha quality) available for Airflow 
1.10.10. You can pull them from the
+There are brand-new production images (alpha quality) available for Airflow 
1.10.10. You can pull them from the
[Apache Airflow Dockerhub](https://hub.docker.com/r/apache/airflow) repository and start using them.
 
 More information about using production images can be found in 
https://github.com/apache/airflow/blob/master/IMAGES.rst#using-the-images. Soon 
it will be updated with
@@ -137,7 +137,7 @@ If you are updating Apache Airflow from a previous version 
to `1.10.10`, please
 
     More details in https://github.com/apache/airflow/pull/7464.
 
--   Setting empty string to a Airflow Variable will now return an empty 
string, it previously returned `None`.
+-   Setting an empty string to an Airflow Variable will now return an empty string; it previously returned `None`.
 
     Example:
 
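The example that follows this bullet in the blog post falls outside the diff context. A minimal sketch of the behaviour change it describes, assuming an initialized Airflow metadata database (the variable key is hypothetical):

```python
from airflow.models import Variable

# Store an empty string in an Airflow Variable.
Variable.set("my_key", "")

# Airflow >= 1.10.10 returns '' here; earlier releases returned None.
print(repr(Variable.get("my_key")))
```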
diff --git a/landing-pages/site/content/en/blog/airflow-1.10.8-1.10.9/index.md 
b/landing-pages/site/content/en/blog/airflow-1.10.8-1.10.9/index.md
index 36e12e6b81..a2209c08c9 100644
--- a/landing-pages/site/content/en/blog/airflow-1.10.8-1.10.9/index.md
+++ b/landing-pages/site/content/en/blog/airflow-1.10.8-1.10.9/index.md
@@ -31,7 +31,7 @@ Some of the noteworthy new features (user-facing) are:
 
 ### Add tags to DAGs and use it for filtering in the UI
 
-In order to filter DAGs (e.g by team), you can add tags in each dag. The 
filter is saved in a cookie and can be reset by the reset button.
+In order to filter DAGs (e.g. by team), you can add tags to each DAG. The filter is saved in a cookie and can be reset by the reset button.
 
 For example:
 
@@ -62,7 +62,7 @@ We strongly recommend users to use Python >= 3.6
 ### Use Airflow RBAC UI
 Airflow 1.10.9 ships with 2 UIs, the default is non-RBAC Flask-admin based UI 
and Flask-appbuilder based UI.
 
-The Flask-AppBuilder (FAB) based UI is allows Role-based Access Control and 
has more advanced features compared to
+The Flask-AppBuilder (FAB) based UI allows Role-based Access Control and has more advanced features compared to
 the legacy Flask-admin based UI. This UI can be enabled by setting `rbac=True` 
in `[webserver]` section in your `airflow.cfg`.
 
 Flask-admin based UI is deprecated and new features won't be ported to it. 
This UI will still be the default
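For readers unfamiliar with the tagging feature discussed above, a minimal 1.10-era sketch of a DAG carrying tags (the DAG id and tag names are hypothetical); the RBAC UI covered in the same post is enabled separately via `rbac = True` in the `[webserver]` section of `airflow.cfg`:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

# Tags appear as filter chips on the DAG list page; the chosen filter
# is kept in a cookie, as described above.
with DAG(
    dag_id="example_tagged_dag",
    start_date=datetime(2020, 1, 1),
    schedule_interval=None,
    tags=["team-a", "sql"],
) as dag:
    DummyOperator(task_id="noop")
```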
diff --git a/landing-pages/site/content/en/blog/airflow-2.2.0/index.md 
b/landing-pages/site/content/en/blog/airflow-2.2.0/index.md
index 1948f17d40..20367f4c1f 100644
--- a/landing-pages/site/content/en/blog/airflow-2.2.0/index.md
+++ b/landing-pages/site/content/en/blog/airflow-2.2.0/index.md
@@ -63,7 +63,7 @@ More information on the `@task.docker` decorator can be found 
at: [Using the Tas
 
 You can now apply validation on DAG params by passing a `Param` object for 
each param. The `Param` object supports the full [json-schema validation 
specifications](https://json-schema.org/draft/2020-12/json-schema-validation.html).
 
-Currently this only functions with manually triggered DAGs, but it does set 
the stage for future params related functionality.
+Currently, this only functions with manually triggered DAGs, but it does set the stage for future params-related functionality.
 
 More information can be found at: 
[Params](https://airflow.apache.org/docs/apache-airflow/stable/concepts/params.html)
 
diff --git a/landing-pages/site/content/en/blog/airflow-3.1.0/index.md 
b/landing-pages/site/content/en/blog/airflow-3.1.0/index.md
index 6f06717b4e..bba2f643ce 100644
--- a/landing-pages/site/content/en/blog/airflow-3.1.0/index.md
+++ b/landing-pages/site/content/en/blog/airflow-3.1.0/index.md
@@ -138,7 +138,7 @@ The new **React Plugin System** (**AIP-68**) transforms how 
you extend Airflow's
 the old Flask-based approach with a modern toolkit that lets you customize 
Airflow exactly how your team works.
 
 Want to embed your company's dashboard right in the Airflow UI? Build React 
applications or iframes that will
-render inside Airflow's (nav bar, dashboard, details page, etc). Want to link 
to your existing tools
+render inside Airflow's UI (nav bar, dashboard, details page, etc.). Want to link to your existing tools
 seamlessly? Create custom external links to your resources. Want to extend 
Airflow's API server? Register
 FastAPI sub applications and middlewares that fit your specific processes.
 
diff --git a/landing-pages/site/content/en/blog/airflow-survey-2022/index.md 
b/landing-pages/site/content/en/blog/airflow-survey-2022/index.md
index 867c97657a..71101fe75c 100644
--- a/landing-pages/site/content/en/blog/airflow-survey-2022/index.md
+++ b/landing-pages/site/content/en/blog/airflow-survey-2022/index.md
@@ -28,7 +28,7 @@ The raw response data will be made available here soon, in 
the meantime, feel fr
 ### Deployments
 
 - 85% of the Airflow users have between 1 and 7 active Airflow instances. 
62.5% of the Airflow users have between 11 and 250 DAGs in their largest 
Airflow instance. 75% of the surveyed Airflow users have between 1 and 100 
tasks per DAG.
-- Close to 85% of users use one of the Airflow 2 versions, 9.2% users still 
use 1.10.15, while the remaining 6.3% are still using olderAirflow 1 versions. 
The good news is that the majority of users on Airflow 1 are planning migration 
to Airflow 2 quite soon, with resources and capacity being the main blockers.
+- Close to 85% of users use one of the Airflow 2 versions, 9.2% of users still use 1.10.15, while the remaining 6.3% are still using older Airflow 1 versions. The good news is that the majority of users on Airflow 1 are planning migration to Airflow 2 quite soon, with resources and capacity being the main blockers.
 - In comparison to results from 
[2020](https://airflow.apache.org/blog/airflow-survey-2020/#overview-of-the-user),
 more users were interested in monitoring in general and specifically in using 
tools such as external monitoring services (40.7%, up from 29.6%) and 
information from metabase (35.7%, up from 25.1%).
 - Celery (52.7%) and Kubernetes (39.4%) are the most common executors used.
 
@@ -160,7 +160,7 @@ Airflow documentation is a critical source of information, 
with more than 90% of
 | 1000+    | 10  | 4.8%  |
 | 501-1000 | 9   | 4.3%  |
 
-62.5% of the Airflow users surveyed have between 11 to 250 DAGs in their 
largest Airflow instance.
+62.5% of the Airflow users surveyed have between 11 and 250 DAGs in their 
largest Airflow instance.
 
 ### How many active Airflow instances do you have? (single choice)
 
diff --git 
a/landing-pages/site/content/en/blog/apache-airflow-for-newcomers/index.md 
b/landing-pages/site/content/en/blog/apache-airflow-for-newcomers/index.md
index 5566da7e45..8342fa6e50 100644
--- a/landing-pages/site/content/en/blog/apache-airflow-for-newcomers/index.md
+++ b/landing-pages/site/content/en/blog/apache-airflow-for-newcomers/index.md
@@ -120,7 +120,7 @@ calls a python function, AwsBatchOperator which executes a 
job on AWS Batch and
 
 #### Sensors
 Sensors can be described as special operators that are used to monitor a 
long-running task.
-Just like Operators, there are many predefined sensors in Airflow. These 
includes
+Just like Operators, there are many predefined sensors in Airflow. These 
include
 
   - AthenaSensor: Asks for the state of the Query until it reaches a failure 
state or success state.
   - AzureCosmosDocumentSensor: Checks for the existence of a document which 
matches the given query in CosmosDB
diff --git 
a/landing-pages/site/content/en/blog/documenting-using-local-development-environments.md
 
b/landing-pages/site/content/en/blog/documenting-using-local-development-environments.md
index 56418829a1..770366ccb5 100644
--- 
a/landing-pages/site/content/en/blog/documenting-using-local-development-environments.md
+++ 
b/landing-pages/site/content/en/blog/documenting-using-local-development-environments.md
@@ -11,7 +11,7 @@ date: 2019-11-22
 
 ## Documenting local development environment of Apache Airflow
 
-From Sept to November, 2019 I have been participating in a wonderful 
initiative, [Google Season of 
Docs](https://developers.google.com/season-of-docs).
+From September to November 2019, I have been participating in a wonderful initiative, [Google Season of Docs](https://developers.google.com/season-of-docs).
 
I had the pleasure of contributing to the Apache Airflow open source project as a technical writer.
 My initial assignment was an extension to the GitHub-based Contribution guide.
diff --git 
a/landing-pages/site/content/en/blog/experience-in-google-season-of-docs-2019-with-apache-airflow.md
 
b/landing-pages/site/content/en/blog/experience-in-google-season-of-docs-2019-with-apache-airflow.md
index f2cc732b43..dd7174c6d7 100644
--- 
a/landing-pages/site/content/en/blog/experience-in-google-season-of-docs-2019-with-apache-airflow.md
+++ 
b/landing-pages/site/content/en/blog/experience-in-google-season-of-docs-2019-with-apache-airflow.md
@@ -28,13 +28,13 @@ The second one was [Apache Cassandra][3], on which I also 
had worked extensively
 Considering the total experience, I decided to go with the Airflow.
 
 ## Project selection
-After selecting the org, the next step was to choose the project. Again, my 
previous experience played a role here, and I ended up picking the **How to 
create a workflow** . The aim of the project was to write documentation which 
will help users in creating complex as well as custom DAGs.  
+After selecting the org, the next step was to choose the project. Again, my previous experience played a role here, and I ended up picking **How to create a workflow**. The aim of the project was to write documentation which would help users in creating complex as well as custom DAGs.
 The final deliverables were a bit different, though. More on that later.
 
After submitting my application, I got involved in my job until one day, I saw a mail from Google confirming my selection as a Technical Writer for the project.
 
 ## Community Bonding
+Getting selected is just the beginning. I got the invite to the Airflow Slack channel where most of the discussions happened.
 My mentor was [Ash-Berlin Taylor][4] from Apache Airflow. I started talking to 
my mentor to get a general sense of what deliverables were expected. The 
deliverables were documented in [confluence][5].
 
 - A page for how to create a DAG that also includes:
@@ -57,12 +57,12 @@ After connecting with the mentor, I started engaging with 
the overall Airflow co
 
 ## Doc Development
 I picked DAG run as my first deliverable. I chose this topic as some parts of 
it were already documented but needed some additional text.
-I splitter the existing Scheduling & Triggers page into two new pages.
+I split the existing Scheduling & Triggers page into two new pages.
 1. Schedulers
 2. DAG Runs
 
 Most of the details unrelated to schedulers were moved to DAG runs page, and 
then missing points such as how to re-run a task or DAG were added.
-Once I was satisfied with my version, I asked my mentor and Kamil to review 
it. For the first version, I shared the text in the Google docs file in which 
the reviewers added comments.
+Once I was satisfied with my version, I asked my mentor and Kamil to review it. For the first version, I shared the text in a Google Docs file in which the reviewers added comments.
 However, the document started getting messy, and it became difficult to track 
the changes. The time had come now to raise a proper Pull Request.
 
This was the time when I faced my first challenge. The documentation of Apache Airflow is written using RST (reStructuredText) syntax, with which I was entirely unfamiliar. I had mostly worked in Markdown.
@@ -79,7 +79,7 @@ This required a bit of trial and error. I studied the current 
pattern in Airflow
 
 In the end, all the reviewers approved the PR, but it was not merged until two 
months later. This was because we doubted if some more pages, such as 
**Concepts**, should also be split up, resulting in a better-structured 
document. In the end, we decided to delay it until we discussed it with the 
broader community.
 
-My [second PR][9] was a completely new document. It was related to How to 
create your custom operator. For this, since now I was familiar with most of 
the syntax, I directly raised the PR without going via Google docs. I received 
a lot of comments again, but this time they were more related to what I had 
written rather than how I had written it.
+My [second PR][9] was a completely new document. It was related to How to create your custom operator. For this, since I was now familiar with most of the syntax, I directly raised the PR without going via Google Docs. I received a lot of comments again, but this time they were more related to what I had written rather than how I had written it.
 e.g., Describing in detail how to use **template fields** and clean up my code 
examples. The fewer grammatical & formatting error comments showed I had made 
progress.
 The PR was accepted within two weeks and gave me a huge confidence boost.
 
diff --git a/landing-pages/site/content/en/ecosystem/_index.md 
b/landing-pages/site/content/en/ecosystem/_index.md
index 81b26b8054..c0d8725926 100644
--- a/landing-pages/site/content/en/ecosystem/_index.md
+++ b/landing-pages/site/content/en/ecosystem/_index.md
@@ -209,7 +209,7 @@ Apache Airflow releases the [Official Apache Airflow 
Community Chart](https://ai
 
 [airflow-config](https://github.com/airflow-laminar/airflow-config) - 
[Pydantic](https://pydantic.dev) / [Hydra](https://hydra.cc) based 
configuration system for DAG and Task arguments
 
-[airflow-priority](https://github.com/airflow-laminar/airflow-priority) - 
Priority Tags (P1, P2, etc) for Airflow DAGs with automated alerting to 
Datadog, New Relic, Slack, Discord, and more
+[airflow-priority](https://github.com/airflow-laminar/airflow-priority) - 
Priority Tags (P1, P2, etc.) for Airflow DAGs with automated alerting to 
Datadog, New Relic, Slack, Discord, and more
 
 [airflow-ha](https://github.com/airflow-laminar/airflow-ha) - High 
Availability (HA) DAG Utility
 
