This is an automated email from the ASF dual-hosted git repository.

kaxilnaik pushed a commit to branch staging
in repository https://gitbox.apache.org/repos/asf/airflow-site.git

commit f54744fb76f0739dc6695b373cda241ec2e23156
Author: Didier Durand <[email protected]>
AuthorDate: Mon Nov 24 16:02:41 2025 +0100

    [Doc] fixing 404 error for incorrect ETL link (#1265)
    
    * [Doc] fixing 404 error for incorrect ETL link
    
    * [Doc] Removed the link to astronomer site
---
 landing-pages/site/content/en/use-cases/etl_analytics.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/landing-pages/site/content/en/use-cases/etl_analytics.md b/landing-pages/site/content/en/use-cases/etl_analytics.md
index c578c3e22c..ea94f2a53a 100644
--- a/landing-pages/site/content/en/use-cases/etl_analytics.md
+++ b/landing-pages/site/content/en/use-cases/etl_analytics.md
@@ -13,9 +13,9 @@ blocktype: use-case
 
 </div>
 
-Extract-Transform-Load (ETL) and Extract-Load-Transform (ELT) data pipelines are the most common use case for Apache Airflow. 90% of respondents in the 2023 Apache Airflow survey are using Airflow for ETL/ELT to power analytics use cases.  
+Extract-Transform-Load (ETL) and Extract-Load-Transform (ELT) data pipelines are the most common use case for Apache Airflow. 90% of respondents in the 2023 Apache Airflow survey are using Airflow for ETL/ELT to power analytics use cases.
 
-The video below shows a simple ETL/ELT pipeline in Airflow that extracts climate data from a CSV file, as well as weather data from an API, runs transformations and then loads the results into a database to power a dashboard. You can find the code for this example [here](https://github.com/astronomer/airflow-quickstart).
+The video below shows a simple ETL/ELT pipeline in Airflow that extracts climate data from a CSV file, as well as weather data from an API, runs transformations and then loads the results into a database to power a dashboard.
 
 
 <div id="videoContainer" style="display: flex; justify-content: center; align-items: center; border: 2px solid #ccc; width: 75%; margin: auto; padding: 20px;">
@@ -33,7 +33,7 @@ Airflow is the de-facto standard for defining ETL/ELT pipelines as Python code.
 - **Tool agnostic**: Airflow can be used to orchestrate ETL/ELT pipelines for any data source or destination.
 - **Extensible**: There are many Airflow modules available to connect to any data source or destination, and you can write your own custom operators and hooks for specific use cases.
 - **Dynamic**: In Airflow you can define [dynamic tasks](https://airflow.apache.org/docs/apache-airflow/stable/authoring-and-scheduling/dynamic-task-mapping.html), which serve as placeholders to adapt at runtime based on changing input.
-- **Scalable**: Airflow can be scaled to handle infinite numbers of tasks and workflows, given enough computing power.  
+- **Scalable**: Airflow can be scaled to handle infinite numbers of tasks and workflows, given enough computing power.
 
 
 ## Airflow features for ETL/ELT pipelines
