heavenlxj opened a new issue #19149:
URL: https://github.com/apache/airflow/issues/19149


   ### Apache Airflow version
   
   2.1.3
   
   ### Operating System
   
   Ubuntu
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-amazon==2.1.0
   apache-airflow-providers-celery==2.0.0
   apache-airflow-providers-cncf-kubernetes==2.0.2
   apache-airflow-providers-docker==2.1.0
   apache-airflow-providers-elasticsearch==2.0.2
   apache-airflow-providers-ftp==2.0.0
   apache-airflow-providers-google==5.0.0
   apache-airflow-providers-grpc==2.0.0
   apache-airflow-providers-hashicorp==2.0.0
   apache-airflow-providers-http==2.0.0
   apache-airflow-providers-imap==2.0.0
   apache-airflow-providers-microsoft-azure==3.1.0
   apache-airflow-providers-mysql==2.1.0
   apache-airflow-providers-postgres==2.0.0
   apache-airflow-providers-redis==2.0.0
   apache-airflow-providers-sendgrid==2.0.0
   apache-airflow-providers-sftp==2.1.0
   apache-airflow-providers-slack==4.0.0
   apache-airflow-providers-sqlite==2.0.0
   apache-airflow-providers-ssh==2.1.0
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   _No response_
   
   ### What happened
   
   I need to record each task run's status in my own table, so I invoke a REST API from the `on_success_callback` and `on_failure_callback` of the operator. My code is attached here:
   
   ```python
   from datetime import datetime

   from airflow import DAG


   def test():
       print("Hello World!!")


   def success_op_callback(context):
       print("######   INTO success_op_callback #########")
       # _invoke_operator_run calls a REST API that records the task
       # status in our MySQL table.
       _invoke_operator_run(context)


   def failure_op_callback(context):
       print("######   INTO failure_op_callback #########")
       _invoke_operator_run(context)


   with DAG(
       dag_id="call_test",
       start_date=datetime(2021, 1, 1),
       schedule_interval="0 0 * * *",
   ) as dag:

       # CustomPythonOperator is our subclass of PythonOperator.
       test = CustomPythonOperator(
           task_id="callback_test",
           python_callable=test,
           provide_context=True,
           on_success_callback=success_op_callback,
           on_failure_callback=failure_op_callback,
       )
   ```
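
   For context, `_invoke_operator_run` is just a thin REST call. A minimal sketch of it follows; the endpoint URL and payload fields are placeholders, not the real service (which writes the record into a MySQL table):

   ```python
   import requests


   def _invoke_operator_run(context):
       # Minimal sketch: POST the task run status to an internal status
       # API. The URL and payload fields below are placeholders.
       ti = context["task_instance"]
       requests.post(
           "http://status-service.internal/api/task-run",  # placeholder
           json={
               "dag_id": ti.dag_id,
               "task_id": ti.task_id,
               "execution_date": str(context["execution_date"]),
               "state": ti.state,
           },
           timeout=10,
       )
   ```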
   
   
   When the DAG was scheduled, I found that `on_success_callback` is invoked every 30 seconds. This makes no sense to me: why is the callback triggered even when the DAG is not running? What should I do in this case to avoid the callbacks being invoked outside the schedule?
   
   ### What you expected to happen
   
   The scheduler should only check whether the DAG file has been updated; neither the callbacks nor the task execution should be triggered.
   
   ### How to reproduce
   
   _No response_
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

