boltonidze opened a new issue, #49596:
URL: https://github.com/apache/airflow/issues/49596

   ### Apache Airflow version
   
   2.10.5
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   I want to start and end my deferred task from the triggerer. I use a slightly 
modified Airflow operator for that:
   
   ```python3
   from datetime import timedelta
   from typing import Any, AsyncIterator, List, Union, cast
   
    from airflow.triggers.base import StartTriggerArgs
    from airflow.triggers.base import TriggerEvent, TaskSuccessEvent
   from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor
   from airflow.providers.amazon.aws.triggers.s3 import S3KeyTrigger
   
   class MyS3KeyTrigger(S3KeyTrigger):
       def __init__(self, *, end_from_trigger: bool = True, **kwargs):
           super().__init__(**kwargs)
           self.end_from_trigger = end_from_trigger
       
       def serialize(self) -> tuple[str, dict[str, Any]]:
           """Serialize the trigger param and module path."""
           _, json_args = super().serialize()
           json_args['end_from_trigger'] = self.end_from_trigger
           self.log.info(f"SERIALIZE: {json_args}")
           
           return ('MyS3KeyTrigger', json_args)
       
        async def run(self) -> AsyncIterator[TriggerEvent]:  # type: ignore[override]
            async for event in super().run():
                if self.end_from_trigger and event.payload.get("status") == "success":
                    yield TaskSuccessEvent()
                else:
                    yield event
   
   class S3KeySensorAsync(S3KeySensor):
       def __init__(
               self,
               *,
               bucket_key: Union[str, List[str]],
               deferrable: bool = True,
               end_from_trigger: bool = True,
               start_from_trigger: bool = False,
               **kwargs: Any,
       ):
            super().__init__(bucket_key=bucket_key, deferrable=deferrable, **kwargs)
           self.end_from_trigger = end_from_trigger
           self.start_from_trigger = start_from_trigger
           self.start_trigger_args = StartTriggerArgs(
               trigger_cls="MyS3KeyTrigger",
               trigger_kwargs={
                   "bucket_name": cast(str, self.bucket_name),
                   "bucket_key": self.bucket_key,
                   "wildcard_match": self.wildcard_match,
                   "aws_conn_id": self.aws_conn_id,
                   "verify": self.verify,
                   "poke_interval": self.poke_interval,
                   "should_check_fn": bool(self.check_fn),
                   "use_regex": self.use_regex,
                   "end_from_trigger": self.end_from_trigger
               },
               next_method="execute_complete",
               next_kwargs=None,
               timeout=timedelta(seconds=self.timeout),
           )
   
        def _defer(self) -> None:
            """Check for keys in S3 and defer using the triggerer."""
            self.defer(
                timeout=timedelta(seconds=self.timeout),
                trigger=MyS3KeyTrigger(
                   bucket_name=cast(str, self.bucket_name),
                   bucket_key=self.bucket_key,
                   wildcard_match=self.wildcard_match,
                   aws_conn_id=self.aws_conn_id,
                   verify=self.verify,
                   poke_interval=self.poke_interval,
                   should_check_fn=bool(self.check_fn),
                   use_regex=self.use_regex,
                   end_from_trigger=self.end_from_trigger,
               ),
               method_name="execute_complete",
           )
   ```
   
   But when I try to create a sensor task that uses macros:
   ```python3
   s3_sensor = S3KeySensor(
           task_id="s3_key_sensor",
           bucket_key=f"s3://some-s3-bucket/{{{{ ds }}}}",
           start_from_trigger=True,
           poke_interval=60,
       )
   ```
   
   The macros are not rendered, and the trigger task's kwargs look like this:
   
   `{'verify': None, 'use_regex': False, 'bucket_key': 's3://some-s3-bucket/{{ ds }}', 'aws_conn_id': 'aws_default', 'bucket_name': None, 'poke_interval': 60.0, 'wildcard_match': False, 'should_check_fn': False, 'end_from_trigger': True}`
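
   For contrast, here is a minimal stdlib-only sketch of why the raw string reaches the trigger (illustrative names only, not Airflow code; plain `str.replace` stands in for Jinja rendering):

```python3
# Sketch (not Airflow code): why the trigger sees the raw string.
# "{{ ds }}" is Jinja syntax that Airflow normally renders from
# template_fields with the task context just before execute() runs.
raw_bucket_key = "s3://some-s3-bucket/{{ ds }}"

# start_trigger_args is built in __init__, i.e. at DAG-parse time,
# so trigger_kwargs capture the unrendered value:
trigger_kwargs = {"bucket_key": raw_bucket_key}
assert "{{ ds }}" in trigger_kwargs["bucket_key"]

# Stand-in for Jinja rendering (plain str.replace, just to show the
# value the trigger would need, assuming ds=2024-01-01):
rendered = raw_bucket_key.replace("{{ ds }}", "2024-01-01")
print(rendered)  # s3://some-s3-bucket/2024-01-01
```

   As far as I can tell, the regular deferral path renders templated fields before `execute()` runs, while `start_from_trigger` serializes the kwargs at parse time, before any rendering happens.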
   
   ### What you think should happen instead?
   
   Macros should be rendered before the trigger kwargs are serialized.
   
   ### How to reproduce
   
   Use the code snippet above
   
   ### Operating System
   
   Debian GNU/Linux 12 (bookworm)
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Other 3rd-party Helm chart
   
   ### Deployment details
   
   _No response_
   
   ### Anything else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

