Shadowsong27 opened a new issue #9486: URL: https://github.com/apache/airflow/issues/9486
**Apache Airflow version**: v1.10.10

**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):

**Environment**:

- **Cloud provider or hardware configuration**: AWS
- **OS** (e.g. from /etc/os-release): Deployed in ECS Fargate
- **Install tools**: poetry

**What happened**:

I am trying to display the logs of Fargate tasks in the Airflow UI when using the ECSOperator. According to the docs, this is achieved by providing the following task arguments:

1. `awslogs_group`
2. `awslogs_region`
3. `awslogs_stream_prefix`

However, it fails with the error message:

```
An error occurred (ResourceNotFoundException) when calling the GetLogEvents operation: The specified log stream does not exist
```

I went on to examine the [source code](https://github.com/apache/airflow/blob/a00e188ded01028409e041130d1e4f02e4e3a109/airflow/providers/amazon/aws/operators/ecs.py#L208) and realised the log stream name is constructed as:

```python
stream_name = "{}/{}".format(self.awslogs_stream_prefix, task_id)
```

whereas the [official AWS docs](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/using_awslogs.html) state:

```
The awslogs-stream-prefix option allows you to associate a log stream with the specified prefix, the container name, and the ID of the Amazon ECS task to which the container belongs. If you specify a prefix with this option, then the log stream takes the following format:

prefix-name/container-name/ecs-task-id
```

After manually patching my container name into the Airflow task argument `awslogs_stream_prefix`, it works; this is my current workaround (a sketch follows below).

Personally, I am not sure whether this issue arises from my specific Fargate configuration, or whether it is generic enough to be considered a bug.

**What you expected to happen**:

Logs displaying in the Airflow log UI.

**What do you think went wrong?**:

The log stream prefix is not dynamically constructed properly (the container name segment is missing).

**How to reproduce it**:

Running any ECS Fargate task with the ECSOperator and the default awslogs configuration should reproduce this.
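For concreteness, here is a minimal sketch of the workaround described above. All names (cluster, task definition, container, subnet, region) are placeholders, and the import path assumes the contrib operator shipped with Airflow 1.10.x:

```python
from airflow.contrib.operators.ecs_operator import ECSOperator

# (inside a DAG definition)
run_fargate_task = ECSOperator(
    task_id="run_fargate_task",
    cluster="my-cluster",                  # placeholder cluster name
    task_definition="my-task-definition",  # placeholder task definition
    overrides={},
    launch_type="FARGATE",
    network_configuration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123456789abcdef0"],  # placeholder subnet
            "assignPublicIp": "ENABLED",
        }
    },
    awslogs_group="/ecs/my-task-definition",
    awslogs_region="us-east-1",
    # Workaround: bake the container name into the prefix so that the
    # operator's "{prefix}/{ecs_task_id}" matches the real stream name
    # "prefix-name/container-name/ecs-task-id".
    awslogs_stream_prefix="prefix-name/my-container",
)
```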
**Anything else we need to know**:
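For anyone hitting the same error, a quick way to confirm what the actual stream names look like is to list them with boto3 (a hedged sketch; the log group name and region below are placeholders for your own `awslogs_group` and `awslogs_region`):

```python
import boto3

# List the most recent log streams in the task's log group, to compare
# against the stream name the operator tries to read.
logs = boto3.client("logs", region_name="us-east-1")
response = logs.describe_log_streams(
    logGroupName="/ecs/my-task-definition",  # placeholder log group
    orderBy="LastEventTime",
    descending=True,
    limit=10,
)
for stream in response["logStreams"]:
    # Streams appear as "prefix-name/container-name/ecs-task-id",
    # i.e. including the container name that the operator's
    # "{prefix}/{task_id}" construction omits.
    print(stream["logStreamName"])
```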
