This is an automated email from the ASF dual-hosted git repository.

weilee pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
     new ca85da81c4a  docs: Correct TaskFlow capitalization in documentation (#51794)
ca85da81c4a is described below

commit ca85da81c4a828a2d887906bd2cbf8d01db88dab
Author: Wei-Yu Chen <60685934+jasper...@users.noreply.github.com>
AuthorDate: Sun Jul 6 19:41:30 2025 +0800

    docs: Correct TaskFlow capitalization in documentation (#51794)
---
 airflow-core/docs/best-practices.rst         | 8 ++++----
 airflow-core/docs/tutorial/fundamentals.rst  | 2 +-
 airflow-core/docs/tutorial/objectstorage.rst | 4 ++--
 providers/docker/docs/changelog.rst          | 4 ++--
 providers/sftp/docs/sensors/sftp_sensor.rst  | 2 +-
 providers/standard/docs/changelog.rst        | 2 +-
 6 files changed, 11 insertions(+), 11 deletions(-)

diff --git a/airflow-core/docs/best-practices.rst b/airflow-core/docs/best-practices.rst
index 28c3285339a..1383ce9a3a5 100644
--- a/airflow-core/docs/best-practices.rst
+++ b/airflow-core/docs/best-practices.rst
@@ -1010,7 +1010,7 @@ There are certain limitations and overhead introduced by this operator:
   same worker might be affected by previous tasks creating/modifying files etc.

 You can see detailed examples of using :class:`airflow.providers.standard.operators.python.PythonVirtualenvOperator` in
-:ref:`this section in the Taskflow API tutorial <taskflow-dynamically-created-virtualenv>`.
+:ref:`this section in the TaskFlow API tutorial <taskflow-dynamically-created-virtualenv>`.


 Using ExternalPythonOperator
@@ -1078,7 +1078,7 @@ The nice thing about this is that you can switch the decorator back at any time
 developing it "dynamically" with ``PythonVirtualenvOperator``.

 You can see detailed examples of using :class:`airflow.providers.standard.operators.python.ExternalPythonOperator` in
-:ref:`Taskflow External Python example <taskflow-external-python-environment>`
+:ref:`TaskFlow External Python example <taskflow-external-python-environment>`

 Using DockerOperator or Kubernetes Pod Operator
 -----------------------------------------------
@@ -1142,9 +1142,9 @@ The drawbacks:
   containers etc. in order to author a DAG that uses those operators.

 You can see detailed examples of using :class:`airflow.operators.providers.Docker` in
-:ref:`Taskflow Docker example <taskflow-docker_environment>`
+:ref:`TaskFlow Docker example <taskflow-docker_environment>`
 and :class:`airflow.providers.cncf.kubernetes.operators.pod.KubernetesPodOperator`
-:ref:`Taskflow Kubernetes example <tasfklow-kpo>`
+:ref:`TaskFlow Kubernetes example <tasfklow-kpo>`

 Using multiple Docker Images and Celery Queues
 ----------------------------------------------
diff --git a/airflow-core/docs/tutorial/fundamentals.rst b/airflow-core/docs/tutorial/fundamentals.rst
index 20c93e27376..1cce9839315 100644
--- a/airflow-core/docs/tutorial/fundamentals.rst
+++ b/airflow-core/docs/tutorial/fundamentals.rst
@@ -90,7 +90,7 @@ Next, we'll need to create a DAG object to house our tasks. We'll provide a uniq
 Understanding Operators
 -----------------------
 An operator represents a unit of work in Airflow. They are the building blocks of your workflows, allowing you to
-define what tasks will be executed. While we can use operators for many tasks, Airflow also offers the :doc:`Taskflow API <taskflow>`
+define what tasks will be executed. While we can use operators for many tasks, Airflow also offers the :doc:`TaskFlow API <taskflow>`
 for a more Pythonic way to define workflows, which we'll touch on later.

 All operators derive from the ``BaseOperator``, which includes the essential arguments needed to run tasks in Airflow.
diff --git a/airflow-core/docs/tutorial/objectstorage.rst b/airflow-core/docs/tutorial/objectstorage.rst
index 59c4142f7ce..de80e8c973c 100644
--- a/airflow-core/docs/tutorial/objectstorage.rst
+++ b/airflow-core/docs/tutorial/objectstorage.rst
@@ -23,7 +23,7 @@ Cloud-Native Workflows with Object Storage

 .. versionadded:: 2.8

-Welcome to the final tutorial in our Airflow series! By now, you've built DAGs with Python and the Taskflow API, passed
+Welcome to the final tutorial in our Airflow series! By now, you've built DAGs with Python and the TaskFlow API, passed
 data with XComs, and chained tasks together into clear, reusable workflows.

 In this tutorial we'll take it a step further by introducing the **Object Storage API**. This API makes it easier to
@@ -108,7 +108,7 @@ Here's what's happening:
 - We generate a filename based on the task's logical date
 - Using ``ObjectStoragePath``, we write the data directly to cloud storage as Parquet

-This is a classic Taskflow pattern. The object key changes each day, allowing us to run this daily and build a dataset
+This is a classic TaskFlow pattern. The object key changes each day, allowing us to run this daily and build a dataset
 over time. We return the final object path to be used in the next task.

 Why this is cool: No boto3, no GCS client setup, no credentials juggling. Just simple file semantics that work across
diff --git a/providers/docker/docs/changelog.rst b/providers/docker/docs/changelog.rst
index 36c5097830e..f09ed4c66cd 100644
--- a/providers/docker/docs/changelog.rst
+++ b/providers/docker/docs/changelog.rst
@@ -804,7 +804,7 @@ Other

 Features
 ~~~~~~~~

-* ``Add a Docker Taskflow decorator (#15330)``
+* ``Add a Docker TaskFlow decorator (#15330)``

 This version of Docker Provider has a new feature - TaskFlow decorator that only works in Airflow 2.2.
 If you try to use the decorator in pre-Airflow 2.2 version you will get an error:
@@ -900,7 +900,7 @@ Features
 ~~~~~~~~

 * ``Entrypoint support in docker operator (#14642)``
-* ``Add PythonVirtualenvDecorator to Taskflow API (#14761)``
+* ``Add PythonVirtualenvDecorator to TaskFlow API (#14761)``
 * ``Support all terminus task states in Docker Swarm Operator (#14960)``

diff --git a/providers/sftp/docs/sensors/sftp_sensor.rst b/providers/sftp/docs/sensors/sftp_sensor.rst
index 7054d3110dc..d5549ac2aba 100644
--- a/providers/sftp/docs/sensors/sftp_sensor.rst
+++ b/providers/sftp/docs/sensors/sftp_sensor.rst
@@ -28,7 +28,7 @@ To get more information about this sensor visit :class:`~airflow.providers.sftp.
     :end-before: [END howto_operator_sftp_sensor]

-We can also use Taskflow API. It takes the same arguments as the :class:`~airflow.providers.sftp.sensors.sftp.SFTPSensor` along with -
+We can also use TaskFlow API. It takes the same arguments as the :class:`~airflow.providers.sftp.sensors.sftp.SFTPSensor` along with -

 op_args (optional)
     A list of positional arguments that will get unpacked when

diff --git a/providers/standard/docs/changelog.rst b/providers/standard/docs/changelog.rst
index bcbba97c71e..b995fd4edb2 100644
--- a/providers/standard/docs/changelog.rst
+++ b/providers/standard/docs/changelog.rst
@@ -314,7 +314,7 @@ Misc
 * ``AIP-72: Move non-user facing code to '_internal' (#45515)``
 * ``AIP-72: Add support for 'get_current_context' in Task SDK (#45486)``
 * ``Move Literal alias into TYPE_CHECKING block (#45345)``
-* ``AIP-72: Add Taskflow API support & template rendering in Task SDK (#45444)``
+* ``AIP-72: Remove tuple_in_condition helpers (#45201)``