This is an automated email from the ASF dual-hosted git repository.

eladkal pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
     new 687977f8f3 docs(providers): replace markdown style link with rst style link for amazon and apache-beam (#33992)
687977f8f3 is described below

commit 687977f8f32f013eafda27e3b5caee1e50eacbed
Author: Wei Lee <[email protected]>
AuthorDate: Fri Sep 1 14:07:34 2023 +0800

    docs(providers): replace markdown style link with rst style link for amazon and apache-beam (#33992)
---
 docs/apache-airflow-providers-amazon/operators/s3/s3.rst | 2 +-
 docs/apache-airflow-providers-apache-beam/operators.rst  | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)
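
For reference, the two link syntaxes this commit swaps: Markdown writes a link as [link text](https://example.com), whereas reStructuredText writes `link text <https://example.com>`_ with a trailing underscore. Sphinx does not parse the Markdown form inside .rst files, so it would render as literal brackets; hence the two conversions below. (example.com is a placeholder URL.)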

diff --git a/docs/apache-airflow-providers-amazon/operators/s3/s3.rst b/docs/apache-airflow-providers-amazon/operators/s3/s3.rst
index 4d83416761..516b171244 100644
--- a/docs/apache-airflow-providers-amazon/operators/s3/s3.rst
+++ b/docs/apache-airflow-providers-amazon/operators/s3/s3.rst
@@ -150,7 +150,7 @@ Transform an Amazon S3 object
 
 To transform the data from one Amazon S3 object and save it to another object you can use
 :class:`~airflow.providers.amazon.aws.operators.s3.S3FileTransformOperator`.
-You can also apply an optional [Amazon S3 Select expression](https://docs.aws.amazon.com/AmazonS3/latest/userguide/s3-glacier-select-sql-reference-select.html)
+You can also apply an optional `Amazon S3 Select expression <https://docs.aws.amazon.com/AmazonS3/latest/userguide/s3-glacier-select-sql-reference-select.html>`_
 to select the data you want to retrieve from ``source_s3_key`` using ``select_expression``.
 
 .. exampleinclude:: /../../tests/system/providers/amazon/aws/example_s3.py
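
As a side note on the operator this hunk documents: a minimal, hypothetical sketch of S3FileTransformOperator with ``select_expression`` might look like the following (bucket names, keys, and the script path are placeholders, not taken from the commit; the provider's real example is the exampleinclude referenced above):

    from airflow.providers.amazon.aws.operators.s3 import S3FileTransformOperator

    # Pre-filter the source object with an S3 Select expression, then run a
    # local transform script on the result and upload it to the destination.
    transform = S3FileTransformOperator(
        task_id="s3_file_transform",
        source_s3_key="s3://source-bucket/input.csv",  # hypothetical source
        dest_s3_key="s3://dest-bucket/output.csv",  # hypothetical destination
        transform_script="/usr/local/bin/transform.py",  # hypothetical script
        select_expression="SELECT * FROM s3object s LIMIT 100",
    )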
diff --git a/docs/apache-airflow-providers-apache-beam/operators.rst b/docs/apache-airflow-providers-apache-beam/operators.rst
index 3ec91a050e..b536c514f7 100644
--- a/docs/apache-airflow-providers-apache-beam/operators.rst
+++ b/docs/apache-airflow-providers-apache-beam/operators.rst
@@ -27,7 +27,7 @@ back-ends, which include Apache Flink, Apache Spark, and Google Cloud Dataflow.
 .. note::
     This operator requires ``gcloud`` command (Google Cloud SDK) to be installed on the Airflow worker
     <https://cloud.google.com/sdk/docs/install> when the Apache Beam pipeline runs on the
-    [Dataflow service](https://cloud.google.com/dataflow/docs).
+    `Dataflow service <https://cloud.google.com/dataflow/docs>`_.
 
 
 .. _howto/operator:BeamRunPythonPipelineOperator:
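
For context on the note amended above, a minimal, hypothetical sketch of running a Beam pipeline on the Dataflow service (project, bucket, and file names are placeholders, not from the commit):

    from airflow.providers.apache.beam.operators.beam import BeamRunPythonPipelineOperator

    # Choosing the DataflowRunner is what makes gcloud (Google Cloud SDK)
    # necessary on the Airflow worker, per the note above; the default
    # DirectRunner executes the pipeline locally instead.
    run_pipeline = BeamRunPythonPipelineOperator(
        task_id="run_beam_pipeline_on_dataflow",
        py_file="gs://example-bucket/pipelines/wordcount.py",  # hypothetical pipeline
        runner="DataflowRunner",
        pipeline_options={
            "project": "example-gcp-project",  # hypothetical project
            "region": "us-central1",
            "temp_location": "gs://example-bucket/tmp/",  # hypothetical bucket
        },
    )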
