chriscmorgan opened a new issue, #28452:
URL: https://github.com/apache/airflow/issues/28452

   ### Apache Airflow Provider(s)
   
   docker
   
   ### Versions of Apache Airflow Providers
   
   apache-airflow-providers-celery==3.1.0
   apache-airflow-providers-docker==3.3.0
   
   ### Apache Airflow version
   
   2.5.0
   
   ### Operating System
   
   centos 7
   
   ### Deployment
   
   Other Docker-based deployment
   
   ### Deployment details
   
Running on a docker-swarm cluster deployed locally.
   
   ### What happened
   
   Same issue as https://github.com/apache/airflow/issues/13675
   
With the default `enable_logging=True`, the DAG never completes and stays in the running state.

When using DockerSwarmOperator with `enable_logging=True`, tasks do not succeed and remain in the running state. Checking the docker service logs, I can clearly see that the container ran and exited successfully. Airflow, however, does not recognize that the container finished and keeps the task in the running state.
   
   ### What you think should happen instead
   
   DAG should complete.
   
   ### How to reproduce
   
Docker-compose deployment:

   ```bash
   curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.5.0/docker-compose.yaml'
   docker compose up airflow-init
   docker compose up -d
   ```
   
   DAG code:
   
   ```python
   from datetime import timedelta

   from airflow import DAG
   from airflow.models import Variable
   from airflow.providers.docker.operators.docker_swarm import DockerSwarmOperator
   from airflow.utils.dates import days_ago
   from docker.types import Mount, SecretReference

   from ssi_python_modules import task_fail_slack_alert

   # Get Airflow variables
   etl_processes_version = Variable.get("etl_processes_version")
   etl_processes_ecr_repository = Variable.get("etl_processes_ecr_repository")

   # Set up default args for the job
   default_args = {
       'owner': 'vcgstest',
       'start_date': days_ago(2),
       'retries': 0,
       'on_failure_callback': task_fail_slack_alert,
   }

   # Create the DAG
   dag = DAG(
       'patient_linking_dag',         # DAG ID
       default_args=default_args,
       schedule_interval='0 0 * * *', # At midnight each day
       catchup=False,
   )

   # Define the task inside the DAG
   with dag:
       docker_swarm_task = DockerSwarmOperator(
           task_id="job_run",
           image=f'{etl_processes_ecr_repository}:{etl_processes_version}',
           execution_timeout=timedelta(minutes=90),
           command="<specific code>",
           api_version='auto',
           tty=True,
           enable_logging=True,
           docker_url="tcp://<hostname>:2376",
           tls_ca_cert="/run/secrets/ca.pem",
           tls_client_cert="/run/secrets/cert.pem",
           tls_client_key="/run/secrets/key.pem",
       )
   ```
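   If it helps triage, here is a minimal self-contained DAG of the same shape with `enable_logging=False`, which skips the log-streaming code path suspected in #13675. The image, command, schedule, and `docker_url` below are placeholders for illustration, not values from the real deployment:

   ```python
   # Minimal comparison DAG: same structure as above, but with
   # enable_logging=False so the log-streaming code path is skipped.
   # All concrete values here are placeholders, not from the real deployment.
   from datetime import timedelta

   from airflow import DAG
   from airflow.providers.docker.operators.docker_swarm import DockerSwarmOperator
   from airflow.utils.dates import days_ago

   with DAG(
       "docker_swarm_no_logging_dag",
       default_args={"owner": "airflow", "start_date": days_ago(1), "retries": 0},
       schedule_interval=None,  # trigger manually while testing
       catchup=False,
   ) as dag:
       docker_swarm_task = DockerSwarmOperator(
           task_id="job_run",
           image="alpine:3.17",          # placeholder image
           command="echo done",          # placeholder command
           api_version="auto",
           enable_logging=False,         # the only behavioural difference
           execution_timeout=timedelta(minutes=90),
           docker_url="unix://var/run/docker.sock",  # placeholder URL
       )
   ```

   Running only the logging flag as the variable between the two DAGs should make it easier to confirm whether the hang is in the log-streaming path.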
   
   
   
   ### Anything else
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

