GitHub user GeorgesNass created a discussion: *** Could not read served logs: Invalid URL 'http://:8793/log/dag_id= .... No host supplied

# Airflow with Docker Compose: Log reading error (Invalid URL)

## Disclaimer
I have already searched for similar issues in the Airflow GitHub and forums.  
However, I found that most of the existing threads are either ambiguous or 
incomplete.  
To avoid confusion, I am providing **all my configuration files and steps** 
here so that the community can help me resolve this issue precisely.


## Context

I’m trying to deploy Apache Airflow with Docker Compose on a dedicated VM.  
I built a custom image (`airflow-mlops:latest`) and configured Postgres as the 
backend.  
However, I always get an error when trying to read task logs in the Web UI:

```
*** Could not read served logs: Invalid URL 'http://:8793/log/dag_id=main_pipeline_dag/run_id=manual__2025-09-04T14:39:22.889935+00:00/task_id=task_live_data/attempt=1.log': No host supplied
```
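If I read the error correctly, the URL is being assembled with an empty hostname, which `requests` rejects before any HTTP call is even made. A minimal Python sketch of the mechanism as I understand it (simplified, not Airflow's actual source code):

```python
from urllib.parse import urlsplit

# Sketch of how the served-logs URL appears to be assembled: if the recorded
# hostname of the task instance is empty, the result is 'http://:8793/...'.
hostname = ""  # the task instance's recorded hostname -- apparently empty here
port = 8793
url = f"http://{hostname}:{port}/log/attempt=1.log"

print(url)                     # http://:8793/log/attempt=1.log
print(urlsplit(url).hostname)  # None -> requests reports "No host supplied"
```

So the question seems to be why the hostname in that URL ends up empty in my setup.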

---

## Environment

- **VM**: 4 vCPU / 16 GB RAM / 200 GB SSD  
- **OS**: Debian GNU/Linux 12 (bookworm)
- **Python**: 3.10 (inside Airflow container)  
- **Airflow**: 2.10.2  
- **Executor**: LocalExecutor  
- **Database**: Postgres 13 (dedicated container)  
- **Docker / Compose**: latest version

---

## Configuration files

### `.env`

```env
## ==== Airflow Core configuration ====
AIRFLOW__CORE__EXECUTOR=LocalExecutor
AIRFLOW__CORE__BASE_LOG_FOLDER=/opt/airflow/logs
AIRFLOW__CORE__REMOTE_LOGGING=False

## ==== Airflow Webserver configuration ====
AIRFLOW__WEBSERVER__BASE_URL=http://**<REAL_IPV4_ADDRESS>**:8080
AIRFLOW__WEBSERVER__SECRET_KEY=BdvwS7wrRA9s9ZkYmIPhrYq04rQLv6rrIImfo8WV8nCmvLyysIygopv2JmWHCJN6XSg

## ==== Airflow Logging configuration ====
AIRFLOW__LOGGING__WORKER_LOG_SERVER_ADDRESS=**<REAL_IPV4_ADDRESS>**
AIRFLOW__LOGGING__WORKER_LOG_SERVER_PORT=8793
AIRFLOW__LOGGING__BASE_LOG_FOLDER=/opt/airflow/logs
```
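For context on why I set `WORKER_LOG_SERVER_ADDRESS`: as far as I understand (please correct me if this is wrong), the host part of the served-logs URL normally comes from the hostname recorded on each task instance via `[core] hostname_callable`, whose 2.x default resolves to `socket.getfqdn()`, so I am not sure a dedicated address variable is read at all. A tiny sketch of what that default produces inside a container:

```python
import socket

# My (hedged) understanding: Airflow records the task's hostname through
# [core] hostname_callable, which by default boils down to socket.getfqdn().
# Inside a Docker container this is the container's generated hostname, and
# the webserver then queries http://<that hostname>:8793/log/... for logs.
print(socket.getfqdn())
```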

### `Dockerfile`

```dockerfile
FROM apache/airflow:2.10.2-python3.10

WORKDIR /opt/airflow

COPY requirements.txt /opt/airflow/requirements.txt

USER airflow
RUN pip install --no-cache-dir --progress-bar=off -r /opt/airflow/requirements.txt

COPY --chown=airflow:airflow src /opt/airflow/src
COPY --chown=airflow:airflow dags /opt/airflow/dags

ENV PYTHONPATH="/opt/airflow:${PYTHONPATH}"

EXPOSE 8080
CMD ["webserver"]
```

### `docker-compose.yaml`

```yaml
version: "3.9"

x-airflow-common: &airflow-common
  image: airflow-mlops:latest
  env_file:
    - .env
  environment:
    GUNICORN_CMD_ARGS: "--workers 1 --threads 1 --timeout 120"
    PYTHONPATH: /opt/airflow
    AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres:5432/airflow
    AIRFLOW__CORE__EXECUTOR: LocalExecutor
  volumes:
    - ./dags:/opt/airflow/dags
    - ./src:/opt/airflow/src
    - ./data:/opt/airflow/data
    - ./logs:/opt/airflow/logs
  user: "${AIRFLOW_UID:-50000}:0"
  depends_on:
    postgres:
      condition: service_healthy

services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    volumes:
      - postgres-db-volume:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U airflow"]
      interval: 5s
      retries: 5
    restart: always
    security_opt:
      - seccomp=unconfined

  airflow-webserver:
    <<: *airflow-common
    build:
      context: .
      dockerfile: Dockerfile
    command: webserver
    ports:
      - "8080:8080"
    restart: always

  airflow-scheduler:
    <<: *airflow-common
    command: scheduler
    restart: always

  ## Init (DB migration + admin user creation)
  airflow-init:
    <<: *airflow-common
    command: bash -c "airflow db upgrade && airflow users create --username airflow --password airflow --firstname Admin --lastname User --role Admin --email [email protected]"
    profiles: ["init"]

volumes:
  postgres-db-volume:
```

---

## Commands used

```bash
docker compose down -v --remove-orphans
docker compose build --no-cache
docker compose up airflow-init
docker compose up -d
docker ps
```
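In case it helps, these are the diagnostic commands I can run against the stack (service names taken from the `docker-compose.yaml` above; `airflow config get-value` is the CLI for inspecting the effective configuration):

```shell
# What the running containers actually resolve for the relevant settings:
docker compose exec airflow-scheduler airflow config get-value logging worker_log_server_port
docker compose exec airflow-scheduler airflow config get-value core hostname_callable

# Compare the hostnames the two containers report:
docker compose exec airflow-scheduler hostname
docker compose exec airflow-webserver hostname
```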

---

## Problem

- Airflow UI works (I can log in).  
- DAGs are detected and can be triggered.  
- **BUT** when I try to read logs from a task, I get this error:

```
*** Could not read served logs: Invalid URL 'http://:8793/log/...': No host supplied
```

---

How can I fix this **invalid log URL issue**?  

Thanks in advance for any help!


GitHub link: https://github.com/apache/airflow/discussions/55268
