potiuk commented on issue #17010:
URL: https://github.com/apache/airflow/issues/17010#issuecomment-931288902


   > It'd be nice to be able to have custom trigger rules, but it seems this was already investigated and declined in [AIRFLOW-389](https://issues.apache.org/jira/browse/AIRFLOW-389).
   > 
   > I too need an 'all done and none upstream_failed' rule after an 'optional' task. If the 'optional' task fails by itself, then downstream tasks should proceed; but if it has upstream_failed, then downstream tasks should get the upstream_failed status too.
   > 
   
   I think custom trigger rules are possible, but we need to make sure they are implemented well from an isolation/security point of view. In particular, defining them inside DAG files should not be allowed.
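The semantics asked for in the quote ("all done and none upstream_failed") can be sketched as a plain-Python predicate over upstream task states. This is purely an illustration of the desired behaviour, not Airflow's actual trigger-rule evaluation code; the function and state names are made up for the example (state strings mirror Airflow's task states):

```python
# Hypothetical sketch of an "all done and none upstream_failed" trigger rule.
# This is NOT Airflow's real TriggerRule API, just the desired semantics.

FINISHED = {"success", "failed", "skipped", "upstream_failed"}

def should_run(upstream_states):
    """Return (runnable, new_state) for the downstream task.

    runnable is True only when every upstream task has finished and none
    of them finished as 'upstream_failed'. If any upstream task ended up
    'upstream_failed', the downstream task is marked 'upstream_failed'
    itself instead of running.
    """
    if not all(s in FINISHED for s in upstream_states):
        return False, None  # still waiting on upstream tasks
    if any(s == "upstream_failed" for s in upstream_states):
        return False, "upstream_failed"  # propagate the failure marker
    return True, None  # all done; a direct failure is tolerated

# The 'optional' task failed by itself -> downstream still runs:
print(should_run(["success", "failed"]))           # (True, None)
# Something upstream of the optional task failed -> propagate:
print(should_run(["success", "upstream_failed"]))  # (False, 'upstream_failed')
```

The point of the comment below is that this kind of logic could not simply live in a DAG file, because the scheduler would have to execute it.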
   
   The Airflow scheduler makes scheduling decisions based on trigger rules, and the rule is that only Workers and DAG File Processors (which run in isolated, sandboxed Python processes) can execute user-provided DAG code.
   
   If we were to implement custom trigger rule logic, it would have to be installable only via providers/plugins, similar to the new scheduling Timetables in Airflow 2.2. That means custom trigger rules would have to be "installed" when Airflow is installed, rather than added with DAGs.
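For comparison, this registration fragment shows roughly how custom Timetables are plugged in since Airflow 2.2 (the `timetables` plugin field is real; the timetable and plugin class names here are made up, and the scheduling methods a real Timetable must implement are omitted). A custom trigger rule would presumably need an analogous plugin field, which does not exist today:

```python
# Registration sketch only -- requires an Airflow 2.2+ installation to run.
from airflow.plugins_manager import AirflowPlugin
from airflow.timetables.base import Timetable

class EveryOtherDayTimetable(Timetable):
    """Placeholder custom timetable; the required scheduling methods
    (next_dagrun_info, infer_manual_data_interval) are omitted here."""

class MyCompanyPlugin(AirflowPlugin):
    name = "my_company_plugin"
    # Real Airflow 2.2+ plugin field: registers Timetable classes so the
    # scheduler can use them without importing DAG-provided code.
    timetables = [EveryOtherDayTimetable]
```

Because the plugin is installed alongside Airflow itself, the scheduler can trust and import this code, which is exactly the property a custom trigger rule mechanism would need.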


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
