florentyu opened a new issue, #30504:
URL: https://github.com/apache/airflow/issues/30504

   ### Apache Airflow Provider(s)
   
   microsoft-azure
   
   ### Versions of Apache Airflow Providers
   
   ```
   apache-airflow-providers-amazon==7.4.0
   apache-airflow-providers-common-sql==1.4.0
   apache-airflow-providers-elasticsearch==4.4.0
   apache-airflow-providers-ftp==3.3.1
   apache-airflow-providers-http==4.3.0
   apache-airflow-providers-imap==3.1.1
   apache-airflow-providers-microsoft-azure==5.3.0
   apache-airflow-providers-microsoft-mssql==3.3.2
   apache-airflow-providers-microsoft-winrm==3.1.1
   apache-airflow-providers-odbc==3.2.1
   apache-airflow-providers-postgres==5.4.0
   apache-airflow-providers-salesforce==5.3.0
   apache-airflow-providers-sftp==4.2.4
   apache-airflow-providers-sqlite==3.3.1
   apache-airflow-providers-ssh==3.6.0
   ```
   
   ### Apache Airflow version
   
   2.5.3
   
   ### Operating System
   
   Debian GNU/Linux 11 (bullseye)
   
   ### Deployment
   
   Virtualenv installation
   
   ### Deployment details
   
   - Using PostgreSQL 14.7
   
   ### What happened
   
   Starting with `apache-airflow-providers-microsoft-azure==5.0.0`, the `get_link()` method no longer returns a URL for the `AzureDataFactoryRunPipelineOperator`'s extra link because of a class instance check introduced in commit [78b8ea2f22](https://github.com/apache/airflow/commit/78b8ea2f22239db3ef9976301234a66e50b47a94).
   
   **Web server log:**
   ```
   {{data_factory.py:52}} INFO - The <class 'airflow.serialization.serialized_objects.SerializedBaseOperator'> is not <class 'airflow.providers.microsoft.azure.operators.data_factory.AzureDataFactoryRunPipelineOperator'> class.
   ```
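
   For context, here is a minimal sketch of why the check fails (my reading of the log message above, not a verbatim copy of the provider code): the webserver renders extra links from deserialized DAGs, where every task is a `SerializedBaseOperator`, so an `isinstance` check against the concrete operator class can never pass there.

   ```python
   # Minimal sketch, assuming the extra-link class guards get_link() with an
   # isinstance check as the log message suggests; PipelineRunLinkSketch is an
   # illustrative name, not the provider's class.
   from airflow.models import BaseOperatorLink
   from airflow.providers.microsoft.azure.operators.data_factory import (
       AzureDataFactoryRunPipelineOperator,
   )


   class PipelineRunLinkSketch(BaseOperatorLink):
       name = "Monitor Pipeline Run"

       def get_link(self, operator, *, ti_key):
           # In the webserver, `operator` is a SerializedBaseOperator, so this
           # check is always False there and an empty URL is returned.
           if not isinstance(operator, AzureDataFactoryRunPipelineOperator):
               return ""
           ...  # build and return the ADF monitoring URL here
   ```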
   
   ### What you think should happen instead
   
   A URL should be generated for the extra link during the run of the `AzureDataFactoryRunPipelineOperator`.
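
   A possible direction for a fix (my suggestion only, not a committed change) would be to compare the operator's `task_type`, i.e. its class name, which is preserved on `SerializedBaseOperator`, instead of checking the concrete class:

   ```python
   # Hypothetical adjustment (an assumption, not the provider's actual code):
   # task_type is the operator's class name and survives serialization, so the
   # comparison also holds for SerializedBaseOperator in the webserver.
   def get_link(self, operator, *, ti_key):
       if operator.task_type != "AzureDataFactoryRunPipelineOperator":
           return ""
       ...  # build and return the ADF monitoring URL here
   ```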
   
   ### How to reproduce
   
   **DAG used to reproduce the problem:**
   
   ```python
   import os
   from datetime import datetime, timedelta

   from airflow import DAG
   from airflow.models import Variable
   from airflow.providers.microsoft.azure.operators.data_factory import (
       AzureDataFactoryRunPipelineOperator,
   )

   # Proxy settings required in this environment (values redacted).
   os.environ["HTTP_PROXY"] = "xxxx"
   os.environ["HTTPS_PROXY"] = "xxxx"

   with DAG(
       dag_id='azure_data_factory',
       default_args={
           'owner': 'airflow',
           'depends_on_past': False,
           'email_on_failure': False,
           'email_on_retry': False,
           'retries': 0,
           'retry_delay': timedelta(minutes=5),
       },
       start_date=datetime(2023, 1, 1),
       schedule=None,
       max_active_runs=1,
       catchup=False,
   ) as dag:
       # The operator exposes a "Monitor Pipeline Run" extra link, which stays
       # empty because get_link() returns no URL.
       run_test_pipeline = AzureDataFactoryRunPipelineOperator(
           task_id="run_test_pipeline",
           azure_data_factory_conn_id=Variable.get("ADF_CONNECTION_NAME"),
           pipeline_name=Variable.get("ADF_PIPELINE_NAME_DEMO"),
           wait_for_termination=True,
           check_interval=30,
       )
   ```
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

