matejpavlovic-maistra commented on issue #19426: URL: https://github.com/apache/airflow/issues/19426#issuecomment-962881110
 My awslogs_stream_prefix already follows that naming. My current solution is:

```python
from airflow.models import XCom
from airflow.providers.amazon.aws.operators.ecs import ECSOperator
from airflow.utils.decorators import apply_defaults


class MyECSOperator(ECSOperator):
    @apply_defaults
    def __init__(self, xcom_push=False, **kwargs):
        super(MyECSOperator, self).__init__(**kwargs)
        self.xcom_push_flag = xcom_push

    def execute(self, context):
        try:
            super().execute(context)
        except Exception as e:
            # On failure, store the ECS task id (last segment of the task ARN)
            # in XCom so a failure handler can locate the CloudWatch log stream.
            XCom.set(
                key='return_value',
                value=self.arn.split("/")[-1],
                dag_id='aaaaa',
                task_id='print_the_context',
                execution_date=context.get('execution_date'),
            )
            raise ValueError(e)
```

On failure, a callback function reads the id of the log stream from XCom and fetches the logs using boto3.
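
For reference, here is a minimal sketch of what such a failure callback could look like. The `LOG_GROUP`, `STREAM_PREFIX`, and `CONTAINER_NAME` values are hypothetical placeholders (they must match the awslogs configuration of the actual ECS task definition); the callback pulls the ECS task id stored by the operator above and reads the corresponding CloudWatch log stream with boto3:

```python
# Sketch only: LOG_GROUP, STREAM_PREFIX, and CONTAINER_NAME are placeholders and
# must match the awslogs settings of the ECS task definition.
import boto3

LOG_GROUP = "/ecs/my-task"          # hypothetical CloudWatch log group
STREAM_PREFIX = "my-stream-prefix"  # must equal awslogs_stream_prefix
CONTAINER_NAME = "my-container"     # container name from the task definition


def fetch_ecs_logs_on_failure(context):
    # Pull the ECS task id stored by MyECSOperator.execute() above.
    ti = context["ti"]
    ecs_task_id = ti.xcom_pull(
        dag_id="aaaaa",
        task_ids="print_the_context",
        key="return_value",
    )
    if not ecs_task_id:
        return

    # awslogs stream names follow "<prefix>/<container>/<ecs-task-id>".
    stream_name = f"{STREAM_PREFIX}/{CONTAINER_NAME}/{ecs_task_id}"
    client = boto3.client("logs")
    response = client.get_log_events(
        logGroupName=LOG_GROUP,
        logStreamName=stream_name,
        startFromHead=True,
    )
    for event in response.get("events", []):
        print(event["message"])
```

The callback could then be attached with `on_failure_callback=fetch_ecs_logs_on_failure` on the task (or in `default_args`), so a failed ECS task prints its container logs into the Airflow task log.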
