This is an automated email from the ASF dual-hosted git repository.
potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git
The following commit(s) were added to refs/heads/main by this push:
new f8a775b6e67 replace usage of DAG with dag in main page and minor spelling correction (#49429)
f8a775b6e67 is described below
commit f8a775b6e678e996711b7b91fc919768c532847a
Author: Kalyan R <[email protected]>
AuthorDate: Fri Apr 18 16:11:23 2025 +0530
replace usage of DAG with dag in main page and minor spelling correction (#49429)
---
airflow-core/docs/index.rst | 14 +++++++-------
1 file changed, 7 insertions(+), 7 deletions(-)
diff --git a/airflow-core/docs/index.rst b/airflow-core/docs/index.rst
index a7a21e671c9..d6773d30557 100644
--- a/airflow-core/docs/index.rst
+++ b/airflow-core/docs/index.rst
@@ -71,16 +71,16 @@ Airflow parses the script, schedules the tasks, and executes them in the defined
is displayed in the web interface:
.. image:: /img/ui-dark/demo_graph_and_code_view.png
- :alt: Demo DAG in the Graph View, showing the status of one DAG run along with DAG code.
+ :alt: Demo dag in the Graph View, showing the status of one dag run along with dag code.
|
-This examples uses a simple Bash command and Python function, but Airflow tasks can run virtually any code. You might use
+This example uses a simple Bash command and Python function, but Airflow tasks can run virtually any code. You might use
tasks to run a Spark job, move files between storage buckets, or send a notification email. Here's what that same dag looks
like over time, with multiple runs:
.. image:: /img/ui-dark/demo_grid_view_with_task_logs.png
- :alt: Demo DAG in the Grid View, showing the status of all DAG runs, as well as logs for a task instance
+ :alt: Demo dag in the Grid View, showing the status of all dag runs, as well as logs for a task instance
|
@@ -88,7 +88,7 @@ Each column in the grid represents a single dag run. While the graph and grid vi
several other views to help you monitor and troubleshoot workflows — such as the ``DAG Overview`` view:
.. image:: /img/ui-dark/demo_dag_overview_with_failed_tasks.png
- :alt: Overview of a complex DAG in the Grid View, showing the status of all DAG runs, as well as quick links to recently failed task logs
+ :alt: Overview of a complex dag in the Grid View, showing the status of all dag runs, as well as quick links to recently failed task logs
|
@@ -101,7 +101,7 @@ Why Airflow®?
Airflow is a platform for orchestrating batch workflows. It offers a flexible framework with a wide range of built-in operators
and makes it easy to integrate with new technologies.
-If your workflows have a clear start and end and run on a schedule, they're a great fit for Airflow DAGs.
+If your workflows have a clear start and end and run on a schedule, they're a great fit for Airflow dags.
If you prefer coding over clicking, Airflow is built for you. Defining workflows as Python code provides several key benefits:
@@ -111,7 +111,7 @@ If you prefer coding over clicking, Airflow is built for you. Defining workflows
- **Extensibility**: Customize workflows using a large ecosystem of existing components — or build your own.
Airflow's rich scheduling and execution semantics make it easy to define complex, recurring pipelines. From the web interface,
-you can manually trigger DAGs, inspect logs, and monitor task status. You can also backfill DAG runs to process historical
+you can manually trigger dags, inspect logs, and monitor task status. You can also backfill dag runs to process historical
data, or rerun only failed tasks to minimize cost and time.
The Airflow platform is highly customizable. With the :doc:`public-airflow-interface` you can extend and adapt nearly
@@ -124,7 +124,7 @@ others via the `community <https://airflow.apache.org/community>`_, `Slack <http
Why not Airflow®?
=================
-Airflow® is designed for finite, batch-oriented workflows. While you can trigger DAGs using the CLI or REST API, Airflow is not
+Airflow® is designed for finite, batch-oriented workflows. While you can trigger dags using the CLI or REST API, Airflow is not
intended for continuously running, event-driven, or streaming workloads. That said, Airflow often complements streaming systems like Apache Kafka.
Kafka handles real-time ingestion, writing data to storage. Airflow can then periodically pick up that data and process it in batch.