This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
     new d24527bf75 Add documentation about cli `add connection` and AWS connection URI (#28852)
d24527bf75 is described below

commit d24527bf759c80dd22684a0fb51c283bbafb9298
Author: Arkadiusz Rudny <[email protected]>
AuthorDate: Mon Jan 16 18:26:15 2023 +0100

    Add documentation about cli `add connection` and AWS connection URI (#28852)
---
 .../connections/aws.rst                              | 12 ++++++++++++
 .../logging/s3-task-handler.rst                      | 20 ++++++++++++++------
 2 files changed, 26 insertions(+), 6 deletions(-)

diff --git a/docs/apache-airflow-providers-amazon/connections/aws.rst b/docs/apache-airflow-providers-amazon/connections/aws.rst
index 14956d1de1..e885887980 100644
--- a/docs/apache-airflow-providers-amazon/connections/aws.rst
+++ b/docs/apache-airflow-providers-amazon/connections/aws.rst
@@ -146,6 +146,18 @@ Snippet to create Connection and convert to URI
     os.environ[env_key] = conn_uri
     print(conn.test_connection())
 
+
+  .. warning:: When using the Airflow CLI, a ``@`` may need to be added to the connection URI when the following are not given:
+
+    - login
+    - password
+    - host
+    - port
+
+    See the example below. This is a known Airflow limitation.
+
+    ``airflow connections add aws_conn --conn-uri aws://@/?region_name=eu-west-1``
+
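As a quick sanity check of the URI shape above (an editorial sketch, not part of the committed docs), Python's standard ``urllib.parse`` splits this all-empty authority form cleanly once the ``@`` is present:

```python
from urllib.parse import urlparse, parse_qs

# The URI from the warning above: login, password, host and port are all
# empty, but the bare "@" keeps the authority section well-formed.
uri = "aws://@/?region_name=eu-west-1"
parts = urlparse(uri)

print(parts.scheme)                          # aws
print(parts.netloc)                          # @
print(parse_qs(parts.query)["region_name"])  # ['eu-west-1']
```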
 Using instance profile
 ^^^^^^^^^^^^^^^^^^^^^^
 
diff --git a/docs/apache-airflow-providers-amazon/logging/s3-task-handler.rst b/docs/apache-airflow-providers-amazon/logging/s3-task-handler.rst
index 8352432c22..016c2d5165 100644
--- a/docs/apache-airflow-providers-amazon/logging/s3-task-handler.rst
+++ b/docs/apache-airflow-providers-amazon/logging/s3-task-handler.rst
@@ -113,17 +113,25 @@ We are using the existing ``serviceAccount`` hence ``create: false`` with exist
         delete_worker_pods: 'False'
         encrypt_s3_logs: 'True'
 
-Step3: Create Amazon Web Services connection in Airflow Web UI
+Step3: Create Amazon Web Services connection
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 With the above configurations, Webserver and Worker Pods can access Amazon S3 bucket and write logs without using any Access Key and Secret Key or Instance profile credentials.
 
-The final step to create connections under Airflow UI before executing the DAGs.
+- Using Airflow Web UI
 
-* Login to Airflow Web UI with ``admin`` credentials and Navigate to ``Admin -> Connections``
-* Create connection for ``Amazon Web Services`` and select the options(Connection ID and Connection Type) as shown in the image.
-* Select the correct region where S3 bucket is created in ``Extra`` text box.
+  The final step is to create connections in the Airflow UI before executing the DAGs.
 
-.. image:: /img/aws-base-conn-airflow.png
+  * Log in to the Airflow Web UI with ``admin`` credentials and navigate to ``Admin -> Connections``
+  * Create a connection for ``Amazon Web Services`` and select the options (Connection ID and Connection Type) as shown in the image.
+  * In the ``Extra`` text box, select the correct region where the S3 bucket is created.
+
+  .. image:: /img/aws-base-conn-airflow.png
+
+- Using Airflow CLI
+
+  ``airflow connections add aws_conn --conn-uri aws://@/?region_name=eu-west-1``
+
+  Note that the ``@`` in the ``--conn-uri`` value usually separates the password from the host; here it is kept, even though those fields are empty, to satisfy the URI validator.
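The same URI can also be assembled programmatically, e.g. when generating many connections. A minimal sketch (the ``aws_conn_uri`` helper is hypothetical, not an Airflow API), using only the standard library:

```python
from urllib.parse import urlencode

def aws_conn_uri(region: str, **extra: str) -> str:
    """Hypothetical helper: builds the "aws://@/?..." form used by the
    CLI example above, with the bare "@" standing in for the empty
    login, password, host and port fields."""
    query = urlencode({"region_name": region, **extra})
    return f"aws://@/?{query}"

print(aws_conn_uri("eu-west-1"))  # aws://@/?region_name=eu-west-1
```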
 
 Step4: Verify the logs
 ~~~~~~~~~~~~~~~~~~~~~~
