CodingJonas opened a new issue #9311: URL: https://github.com/apache/airflow/issues/9311
**Apache Airflow version**: 2.0.0dev

**OS** (e.g. from /etc/os-release): Ubuntu 18.04

**Install tools**: pip

**What happened**:

When running a DockerSwarmOperator, the service finishes, but Airflow only detects the finished service about 60 seconds after it has finished. The issue lies in this line: https://github.com/apache/airflow/blob/832593a9fc80f4b56605f3cbf656375bd94cc136/airflow/providers/docker/operators/docker_swarm.py#L171

```python
def _stream_logs_to_output(self):
    logs = self.cli.service_logs(
        self.service['ID'], follow=True, stdout=True, stderr=True, is_tty=self.tty
    )
    line = ''
    while True:
        try:
            log = next(logs)
```

When this log-streaming code is removed, the operator finishes as expected. `next(logs)` is a blocking call, and for some reason docker-py, the library behind this call, does not recognize that the service has finished. After what appears to be exactly 60 seconds the call raises an exception, which allows the operator to continue.

**How to reproduce it**:

Any DAG using the DockerSwarmOperator works, e.g.:

```python
task1 = DockerSwarmOperator(
    task_id='docker_swarm_validator',
    image='alpine:3.11.5',
    api_version='auto',
    command='echo X',
    tty=True,
)
```

**Anything else we need to know**:

A workaround I found is to run the logging in a separate process while the main process polls the current status of the service. Once the service has finished, the logging process can simply be terminated.
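To illustrate, here is a minimal sketch of that workaround, outside of Airflow. The names (`_follow_logs`, `run_and_wait`, `service_is_complete`) are hypothetical, and the blocking log follower and the status check are simulated; a real version would call `service_logs(..., follow=True)` in the child and inspect the service's task states in the parent:

```python
import multiprocessing
import time


def _follow_logs():
    # Stand-in for the blocking log follower: in the real operator,
    # next(logs) on self.cli.service_logs(..., follow=True) keeps
    # blocking after the service exits. Here we just block forever.
    while True:
        time.sleep(1)


def run_and_wait(poll_interval=0.1, service_is_complete=lambda: True):
    """Stream logs in a child process while the parent polls the
    service state; terminate the log streamer once the service is done.

    `service_is_complete` is a hypothetical stand-in for a real status
    check against the Docker API.
    """
    # "fork" is assumed here (POSIX); it avoids re-importing the main
    # module in the child process.
    ctx = multiprocessing.get_context("fork")
    log_proc = ctx.Process(target=_follow_logs, daemon=True)
    log_proc.start()
    while not service_is_complete():
        time.sleep(poll_interval)
    # The service finished; the blocked log reader is no longer needed.
    log_proc.terminate()
    log_proc.join()
    return not log_proc.is_alive()
```

The key point is that the blocking `next(logs)` call never sits on the critical path: the parent decides completion from the service state alone, so the operator returns as soon as the service does, instead of waiting out the ~60-second hang.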
