V0lantis edited a comment on issue #19001:
URL: https://github.com/apache/airflow/issues/19001#issuecomment-1006816889


   There is definitely a mistake on my side, since the following command:
    ```bash
   cat <<EOF > tmp.py
   import os
   
   os.environ['AIRFLOW__CORE__LOGGING_LEVEL'] = 'ERROR'
   os.environ['AIRFLOW__LOGGING__LOGGING_LEVEL'] = 'ERROR'
   from airflow.jobs.scheduler_job import SchedulerJob
   from airflow.utils.db import create_session
   from airflow.utils.net import get_hostname
   import sys
   
   with create_session() as session:
       job = session.query(SchedulerJob).filter_by(hostname=get_hostname()).order_by(
           SchedulerJob.latest_heartbeat.desc()).limit(1).first()
   sys.exit(0 if job.is_alive() else 1)
   EOF
   
   
   /entrypoint python -Wignore tmp.py
   ```
   
   returns:
   
   ```bash
   Traceback (most recent call last):
     File "/opt/airflow/tmp.py", line 5, in <module>
       from airflow.jobs.scheduler_job import SchedulerJob
   ModuleNotFoundError: No module named 'airflow'
   ```
   
   I am using a custom image to install private dependencies. Therefore, my Airflow image is defined by:
   
   <details>
   
   ```Dockerfile
   FROM python:3.9-slim AS compile-image
   RUN apt-get update
   RUN apt-get install -y --no-install-recommends build-essential gcc git ssh libpq-dev python3-dev
   
   RUN python -m venv /opt/venv
   # Make sure we use the virtualenv:
   ENV PATH="/opt/venv/bin:$PATH"
   
   RUN mkdir -p -m 0600 ~/.ssh && ssh-keyscan github.com >> ~/.ssh/known_hosts
   COPY requirements.txt .
   RUN --mount=type=ssh,id=github_ssh_key pip install \
       --no-cache \
       -r requirements.txt
   
   FROM apache/airflow:2.2.1-python3.9
   
   COPY --from=compile-image --chown=airflow /opt/venv /opt/venv
   ENV PATH="/opt/venv/bin:$PATH"
   ENV PYTHONPATH="/opt/venv/lib/python3.9/site-packages/:$PYTHONPATH"
   ```
   
   </details>
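   In hindsight, the failure mode is that the venv built on `python:3.9-slim` comes first on `PATH`, so `python` resolves to an interpreter whose site-packages never contained `apache-airflow`. A generic way to check which interpreter actually runs and whether it can see a given package (nothing here is specific to my image) is:

   ```python
   import importlib.util
   import sys

   # Which interpreter actually runs when you invoke `python`.
   print("interpreter:", sys.executable)

   # Locate a package without importing it; find_spec returns None
   # when the package is not visible on this interpreter's sys.path.
   spec = importlib.util.find_spec("airflow")
   print("airflow found at:", spec.origin if spec else None)
   ```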
   
   I'll try with the normal deployment setup. 
   
   **Update**
   
   Thank you for your comment. Since it is not normal to be unable to import airflow from the CLI, I looked back at how I defined my image and updated it following [Airflow's guidelines](https://airflow.apache.org/docs/docker-stack/build.html#example-when-you-add-packages-requiring-compilation).
   
   I can now time `/entrypoint python -Wignore tmp.py`:
   
   ```
   real 0m3.120s
   user 0m2.916s
   sys  0m0.183s
   ```
   I think the root cause was my image. I am going to revert the livenessProbe to its previous values.
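   For context, the `is_alive()` call the probe relies on boils down to checking that the scheduler's latest heartbeat is recent enough. A simplified, standalone sketch of that logic (the 30-second threshold is an assumption; Airflow derives its own tolerance from the job heartrate and configuration):

   ```python
   from datetime import datetime, timedelta, timezone
   from typing import Optional

   # Illustrative tolerance only; not Airflow's actual value.
   THRESHOLD = timedelta(seconds=30)

   def is_alive(latest_heartbeat: datetime, now: Optional[datetime] = None) -> bool:
       """Simplified liveness check: the last heartbeat must be recent."""
       now = now or datetime.now(timezone.utc)
       return (now - latest_heartbeat) < THRESHOLD
   ```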
   
   Here is the new definition of my image:
   
   ```Dockerfile
   
   FROM apache/airflow:2.2.1-python3.9
   
   USER root
   
   RUN apt-get update
   RUN apt-get install -y --no-install-recommends build-essential gcc git ssh libpq-dev python3-dev \
       && apt-get autoremove -yqq --purge \
       && apt-get clean \
       && rm -rf /var/lib/apt/lists/*
   
   
   RUN mkdir -p -m 0600 ~/.ssh && ssh-keyscan github.com >> ~/.ssh/known_hosts
   COPY requirements.txt .
   RUN --mount=type=ssh,id=github_ssh_key pip install \
       --no-cache \
       -r requirements.txt
   
   USER airflow
   ```
   
   

