phanikumv commented on code in PR #26038:
URL: https://github.com/apache/airflow/pull/26038#discussion_r958112346


##########
docs/apache-airflow-providers-microsoft-azure/operators/azure_synapse.rst:
##########
@@ -0,0 +1,50 @@
+
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Azure Synapse Operators
+=======================
+Azure Synapse Analytics is a limitless analytics service that brings together data integration,
+enterprise data warehousing and big data analytics. It gives you the freedom to query data on your terms,
+using either serverless or dedicated options—at scale.
+Azure Synapse brings these worlds together with a unified experience to ingest,
+explore, prepare, transform, manage and serve data for immediate BI and machine learning needs.
+
+.. _howto/operator:AzureSynapseRunSparkBatchOperator:
+
+AzureSynapseRunSparkBatchOperator
+-----------------------------------
+Use the :class:`~airflow.providers.microsoft.azure.operators.synapse.AzureSynapseRunSparkBatchOperator` to execute a
+Spark application within Synapse Analytics.
+By default, the operator periodically checks the status of the executed Spark job until it
+terminates with a "Succeeded" status.
+
+Below is an example of using this operator to execute a Spark application on Azure Synapse.
+
+  .. exampleinclude:: /../../tests/system/providers/microsoft/azure/example_azure_synapse.py
+      :language: python
+      :dedent: 0

Review Comment:
   done
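
For context, here is a minimal sketch of how the operator documented above might be wired into a DAG. The parameter names (spark_pool, payload, azure_synapse_conn_id, wait_for_termination) and all values are assumptions for illustration only; they are not confirmed by the excerpt above.

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.microsoft.azure.operators.synapse import AzureSynapseRunSparkBatchOperator

    with DAG(dag_id="synapse_spark_sketch", start_date=datetime(2022, 1, 1), schedule_interval=None) as dag:
        # Hypothetical usage; the Spark pool name and payload contents are placeholders.
        run_spark_job = AzureSynapseRunSparkBatchOperator(
            task_id="run_spark_job",
            azure_synapse_conn_id="azure_synapse_default",  # assumed default connection id
            spark_pool="my-spark-pool",                     # placeholder Synapse Spark pool name
            payload={"name": "SparkJob", "file": "abfss://container@account.dfs.core.windows.net/app.py"},
            wait_for_termination=True,                      # assumed flag for the status polling described above
        )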



##########
docs/apache-airflow-providers-microsoft-azure/connections/azure_synapse.rst:
##########
@@ -0,0 +1,70 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+
+
+.. _howto/connection:synapse:
+
+Microsoft Azure Synapse
+=======================
+
+The Microsoft Azure Synapse connection type enables the Azure Synapse Integrations.
+
+Authenticating to Azure Synapse
+-------------------------------
+
+There are multiple ways to connect to Azure Synapse using Airflow.
+
+1. Use `token credentials
+   <https://docs.microsoft.com/en-us/azure/developer/python/azure-sdk-authenticate?tabs=cmd#authenticate-with-token-credentials>`_
+   i.e. add specific credentials (client_id, secret, tenant) and subscription id to the Airflow connection.
+2. Fallback on `DefaultAzureCredential
+   <https://docs.microsoft.com/en-us/python/api/overview/azure/identity-readme?view=azure-python#defaultazurecredential>`_.
+   This includes a mechanism to try different options to authenticate: Managed System Identity, environment variables, authentication through Azure CLI...
+
+Default Connection IDs
+----------------------
+
+All hooks and operators related to Microsoft Azure Synapse use ``azure_synapse_default`` by default.
+
+Configuring the Connection
+--------------------------
+
+Client ID
+    Specify the ``client_id`` used for the initial connection.
+    This is needed for the *token credentials* authentication mechanism.
+    It can be left out to fall back on ``DefaultAzureCredential``.
+
+Secret
+    Specify the ``secret`` used for the initial connection.
+    This is needed for the *token credentials* authentication mechanism.
+    It can be left out to fall back on ``DefaultAzureCredential``.
+
+Tenant ID
+    Specify the Azure tenant ID used for the initial connection.
+    This is needed for the *token credentials* authentication mechanism.
+    It can be left out to fall back on ``DefaultAzureCredential``.
+    Use the key ``extra__azure_synapse__tenantId`` to pass in the tenant ID.
+
+Subscription ID

Review Comment:
   done
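
As a side note, one common way such a connection is supplied in practice is through an environment variable. The sketch below is an assumption-laden illustration: the azure-synapse connection type slug, the mapping of client ID/secret to login/password, and all placeholder values are guesses; only the extra__azure_synapse__tenantId key comes from the doc text above, and further fields (the excerpt is truncated before them) may also be required.

    import os

    # Hypothetical connection definition via Airflow's AIRFLOW_CONN_* mechanism.
    os.environ["AIRFLOW_CONN_AZURE_SYNAPSE_DEFAULT"] = (
        "azure-synapse://my-client-id:my-client-secret@"
        "?extra__azure_synapse__tenantId=my-tenant-id"
    )

    # Leaving out the client id and secret would fall back on DefaultAzureCredential,
    # as described in the authentication section above.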



##########
tests/system/providers/microsoft/azure/example_azure_synapse.py:
##########
@@ -0,0 +1,72 @@
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+import os
+from datetime import datetime, timedelta
+
+from airflow import DAG
+from airflow.providers.microsoft.azure.operators.synapse import AzureSynapseRunSparkBatchOperator
+
+AIRFLOW_HOME = os.getenv("AIRFLOW_HOME", "/usr/local/airflow")
+EXECUTION_TIMEOUT = int(os.getenv("EXECUTION_TIMEOUT", 6))
+
+default_args = {
+    "execution_timeout": timedelta(hours=EXECUTION_TIMEOUT),
+    "retries": int(os.getenv("DEFAULT_TASK_RETRIES", 2)),
+    "retry_delay": 
timedelta(seconds=int(os.getenv("DEFAULT_RETRY_DELAY_SECONDS", 60))),
+}
+
+SPARK_JOB_PAYLOAD = {
+    "name": "SparkJob",
+    "file": 
"abfss://[email protected]/wordcount.py",
+    "args": [
+        "abfss://[email protected]/shakespeare.txt",
+        "abfss://[email protected]/results/",
+    ],
+    "jars": [],
+    "pyFiles": [],
+    "files": [],
+    "conf": {
+        "spark.dynamicAllocation.enabled": "false",
+        "spark.dynamicAllocation.minExecutors": "1",
+        "spark.dynamicAllocation.maxExecutors": "2",
+    },
+    "numExecutors": 2,
+    "executorCores": 4,
+    "executorMemory": "28g",
+    "driverCores": 4,
+    "driverMemory": "28g",
+}
+
+with DAG(
+    dag_id="example_synapse_spark_job",
+    start_date=datetime(2022, 1, 1),
+    schedule_interval=None,

Review Comment:
   done



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
