This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow-site.git


The following commit(s) were added to refs/heads/main by this push:
     new a15732faf3 Fixing some typos in various texts (#1254)
a15732faf3 is described below

commit a15732faf3162b25a5b788c8503b6120417ba14b
Author: Didier Durand <[email protected]>
AuthorDate: Sat Oct 25 08:44:38 2025 +0200

    Fixing some typos in various texts (#1254)
    
    * Fixing some typos in various texts
    
    * Apply suggestion from @potiuk
    
    ---------
    
    Co-authored-by: Jarek Potiuk <[email protected]>
---
 landing-pages/site/config.toml                                          | 2 +-
 landing-pages/site/content/en/blog/airflow-2.4.0/index.md               | 2 +-
 landing-pages/site/content/en/blog/airflow-2.5.0/index.md               | 2 +-
 .../site/content/en/blog/airflow-three-point-oh-is-here/index.md        | 2 +-
 .../experience-in-google-season-of-docs-2019-with-apache-airflow.md     | 2 +-
 landing-pages/site/content/en/blog/introducing_setup_teardown/index.md  | 2 +-
 landing-pages/site/content/en/use-cases/onefootball.md                  | 2 +-
 landing-pages/site/content/en/use-cases/snapp.md                        | 2 +-
 landing-pages/site/layouts/partials/head-css.html                       | 2 +-
 9 files changed, 9 insertions(+), 9 deletions(-)

diff --git a/landing-pages/site/config.toml b/landing-pages/site/config.toml
index 436b7e7381..50b2e8ad33 100644
--- a/landing-pages/site/config.toml
+++ b/landing-pages/site/config.toml
@@ -93,7 +93,7 @@ github_project_repo = "https://github.com/apache/airflow"
 
 # User interface configuration
 [params.ui]
-# Enable to show the side bar menu in its compact state.
+# Enable to show the sidebar menu in its compact state.
 sidebar_menu_compact = false
 #  Set to true to disable breadcrumb navigation.
 breadcrumb_disable = false
diff --git a/landing-pages/site/content/en/blog/airflow-2.4.0/index.md b/landing-pages/site/content/en/blog/airflow-2.4.0/index.md
index 2d8e856dfb..8a1a6c699b 100644
--- a/landing-pages/site/content/en/blog/airflow-2.4.0/index.md
+++ b/landing-pages/site/content/en/blog/airflow-2.4.0/index.md
@@ -76,7 +76,7 @@ def my_task(data_interval_start, data_interval_env)
     ...
 ```
 
-There are a few subtlties as to what you need installed in the virtual env depending on which context variables you access, so be sure to read the [how-to on using the ExternalPythonOperator][howto-externalpythonop]
+There are a few subtleties as to what you need installed in the virtual env depending on which context variables you access, so be sure to read the [how-to on using the ExternalPythonOperator][howto-externalpythonop]
 
 [howto-externalpythonop]: http://airflow.apache.org/docs/apache-airflow/2.4.0/howto/operator/python.html#externalpythonoperator
 
diff --git a/landing-pages/site/content/en/blog/airflow-2.5.0/index.md b/landing-pages/site/content/en/blog/airflow-2.5.0/index.md
index 96f64508e3..81c5f55d2c 100644
--- a/landing-pages/site/content/en/blog/airflow-2.5.0/index.md
+++ b/landing-pages/site/content/en/blog/airflow-2.5.0/index.md
@@ -41,7 +41,7 @@ c. Everything runs in one process, so you can put a breakpoint in your IDE, and
 
 Hopefully the headline says enough. It's lovely, go check it out.
 
-## More improvments to Dynamic-Task mapping
+## More improvements to Dynamic-Task mapping
 
 In a similar vein to the improvements to the Dataset (UI), we have continued to iterate on and improve the feature we first added in Airflow 2.3, Dynamic Task Mapping, and 2.5 includes [dozens of improvements](https://github.com/apache/airflow/pulls?q=is%3Apr+author%3Auranusjr+is%3Aclosed+milestone%3A%22Airflow+2.5.0%22).
 
diff --git a/landing-pages/site/content/en/blog/airflow-three-point-oh-is-here/index.md b/landing-pages/site/content/en/blog/airflow-three-point-oh-is-here/index.md
index b12366f67a..12df7f72d3 100644
--- a/landing-pages/site/content/en/blog/airflow-three-point-oh-is-here/index.md
+++ b/landing-pages/site/content/en/blog/airflow-three-point-oh-is-here/index.md
@@ -67,7 +67,7 @@ The fundamental evolution of Datasets into Data Assets has been done as part of
 
 External event driven scheduling ([AIP-82](https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-82+External+event+driven+scheduling+in+Airflow)) is based on the foundational Data Assets work described above, specifically Watchers. The initial scope as defined in the AIP is complete and now incorporates a “Common Message Bus” interface. This release also includes an implementation of the above for AWS SQS as an “out of the box” integration, which demonstrates DAGs being triggered upon [...]
 
-### Inference execution and hyper-parameter tuning
+### Inference execution and hyperparameter tuning
 
 Many ML and AI Engineers are already using Airflow for ML/AI Ops, especially for model training. However, there were challenges for Inference Execution. Enhancing Airflow for Inference Execution by adding support for non-data-interval-Dags (sorry, that’s a mouthful) is an important change. This work is covered as part of “Remove Execution date unique constraint from DAG run” ([AIP-83](https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-83+Remove+Execution+Date+Unique+Constraint+from+ [...]
 
diff --git a/landing-pages/site/content/en/blog/experience-in-google-season-of-docs-2019-with-apache-airflow.md b/landing-pages/site/content/en/blog/experience-in-google-season-of-docs-2019-with-apache-airflow.md
index dd7174c6d7..c23aec0408 100644
--- a/landing-pages/site/content/en/blog/experience-in-google-season-of-docs-2019-with-apache-airflow.md
+++ b/landing-pages/site/content/en/blog/experience-in-google-season-of-docs-2019-with-apache-airflow.md
@@ -31,7 +31,7 @@ Considering the total experience, I decided to go with the Airflow.
 After selecting the org, the next step was to choose the project. Again, my previous experience played a role here, and I ended up picking the **How to create a workflow** . The aim of the project was to write documentation which will help users in creating complex as well as custom DAGs.
 The final deliverables were a bit different, though. More on that later.
 
-After submitting my application, I got involved in my job until one day, I saw a mail from google confirming my selection as a Technical Writer for the project.
+After submitting my application, I got involved in my job until one day, I saw a mail from Google confirming my selection as a Technical Writer for the project.
 
 ## Community Bonding
 Getting selected is just a beginning.  I got the invite to the Airflow Slack channel where most of the discussions happened.
diff --git a/landing-pages/site/content/en/blog/introducing_setup_teardown/index.md b/landing-pages/site/content/en/blog/introducing_setup_teardown/index.md
index 5651f4bf54..0ede1af155 100644
--- a/landing-pages/site/content/en/blog/introducing_setup_teardown/index.md
+++ b/landing-pages/site/content/en/blog/introducing_setup_teardown/index.md
@@ -57,7 +57,7 @@ with TaskGroup("load") as load:
 do_emr >> load
 ```
 
-In this code, each group has a teardown, and we just arrow the first group to the second. As advertised, `delete_cluster`, a teardown task, is ignored. This has two important consequences: one, even if it fails, the `load` group will still run; and two, `delete_cluster` and `create_configuration` can run in parallel (generally speaking, we’d imagine you don’t want to wait for teardown operations to complete before continuing onto other tasks in the dag). Of course you can override this b [...]
+In this code, each group has a teardown, and we just arrow the first group to the second. As advertised, `delete_cluster`, a teardown task, is ignored. This has two important consequences: one, even if it fails, the `load` group will still run; and two, `delete_cluster` and `create_configuration` can run in parallel (generally speaking, we’d imagine you don’t want to wait for teardown operations to complete before continuing onto other tasks in the dag). Of course, you can override this [...]
 
 ## Conclusion
 
diff --git a/landing-pages/site/content/en/use-cases/onefootball.md b/landing-pages/site/content/en/use-cases/onefootball.md
index f6440c3df2..71ca753b85 100644
--- a/landing-pages/site/content/en/use-cases/onefootball.md
+++ b/landing-pages/site/content/en/use-cases/onefootball.md
@@ -19,4 +19,4 @@ Airflow had been on our radar for a while until one day we took the leap. We use
 We have DAGs orchestrating SQL transformations in our data warehouse, but also DAGs that are orchestrating functions ran against our Kubernetes cluster both for training Machine Learning models and sending daily analytics emails.
 
 ##### What are the results?
-The learning curve was steep but in about 100 days we were able to efficiently use Airflow to manage the complexity of our data engineering. We currently have 17 DAGs (adding on average 1 per week), we have 2 contributions on apache/airflow, we have 7 internal hooks and operators and are planning to add more as our migration efforts continue.
+The learning curve was steep but in about 100 days we were able to efficiently use Airflow to manage the complexity of our data engineering. We currently have 17 DAGs (adding on average 1 per week), we have 2 contributions to apache/airflow, we have 7 internal hooks and operators and are planning to add more as our migration efforts continue.
diff --git a/landing-pages/site/content/en/use-cases/snapp.md b/landing-pages/site/content/en/use-cases/snapp.md
index c4fa2a2e4c..e085518d17 100644
--- a/landing-pages/site/content/en/use-cases/snapp.md
+++ b/landing-pages/site/content/en/use-cases/snapp.md
@@ -16,7 +16,7 @@ To address this challenge and streamline our operations, we recognized the need
 By leveraging Airflow, we aim to automate our critical tasks, enabling us to execute them more efficiently and effectively. This automation will not only enhance our productivity but also provide us with greater control and visibility over our workflows. With Airflow's robust features and flexibility, we are confident that it will significantly improve our team's performance and contribute to the continued success of Snapp.
 
 ##### How did Apache Airflow help to solve this problem?
-After implementing Apache Airflow on our cloud platform, specifically utilizing the KubernetesExecutor, we experienced a significant improvement in our task management capabilities. With Airflow, each sub-team within the Map team was able to create and manage their own DAGs, automating various tasks seamlessly. This included essential procedures such as data updates, model training pipelines, and project deployments, leveraging the SparkKubernetesOperator and other relevant tools.
+After implementing Apache Airflow on our cloud platform, specifically utilizing the KubernetesExecutor, we experienced a significant improvement in our task management capabilities. With Airflow, each subteam within the Map team was able to create and manage their own DAGs, automating various tasks seamlessly. This included essential procedures such as data updates, model training pipelines, and project deployments, leveraging the SparkKubernetesOperator and other relevant tools.
 
 One notable example of Airflow's impact was the creation of a DAG specifically designed to update the traffic congestion colorization for our streets. This DAG runs every 10 minutes, ensuring that our congestion data remains up-to-date and accurate. The intuitive Airflow UI also proved to be invaluable, as it enabled our non-technical teammates to easily work with DAGs and monitor their progress.
 
diff --git a/landing-pages/site/layouts/partials/head-css.html b/landing-pages/site/layouts/partials/head-css.html
index de28523002..738714bdc7 100644
--- a/landing-pages/site/layouts/partials/head-css.html
+++ b/landing-pages/site/layouts/partials/head-css.html
@@ -18,7 +18,7 @@
 */}}
 {{ $scssMain := "scss/main.scss"}}
 {{ if hugo.IsServer }}
-{{/* Note the missing postCSS. This makes it snappier to develop in Chrome, but makes it look sub-optimal in other browsers. */}}
+{{/* Note the missing postCSS. This makes it snappier to develop in Chrome, but makes it look suboptimal in other browsers. */}}
 {{ $css := resources.Get $scssMain | toCSS (dict "enableSourceMap" true) }}
 <link href="{{ $css.RelPermalink }}" rel="stylesheet">
 {{ else }}
