RaphCodec opened a new issue, #56275:
URL: https://github.com/apache/airflow/issues/56275

   ### Apache Airflow version
   
   3.1.0
   
   ### If "Other Airflow 2 version" selected, which one?
   
   _No response_
   
   ### What happened?
   
   I am running Airflow 3.1.0 with the LocalExecutor, using Docker Compose. I have a simple DAG (shown below) just to test out the PythonVirtualenvOperator. When I run the DAG, Airflow automatically uses uv to create the venv in a tmp directory. The DAG then runs successfully, but at the end there is a warning that the venv failed to be deleted, because Airflow is now looking for the venv in a different directory.
   
   Originally the venv was created in /tmp, but Airflow tried to delete it from /opt/airflow/tmp. I then tried setting UV_CACHE_DIR to /opt/airflow/tmp, but now Airflow tries to delete the venv from /opt/airflow/opt/airflow/tmp.
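
   From the paths in the log, it looks as if the working directory is being joined onto a venv path that is already absolute. Here is a tiny standalone sketch of that effect; this is only my guess at the mechanism, not Airflow's actual cleanup code:

   ```python
   import os

   workdir = "/opt/airflow"                    # container working directory
   venv_dir = "/opt/airflow/tmp/venvizdpi6sh"  # where uv actually created the venv

   # If the delete step re-joins the working dir with the venv path as if it were
   # relative (hypothetical join, not Airflow's real code), the prefix doubles:
   print(os.path.join(workdir, venv_dir.lstrip("/")))
   # -> /opt/airflow/opt/airflow/tmp/venvizdpi6sh
   ```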
   
   Log with venv delete failure:
   ```
   INFO - DAG bundles loaded: dags-folder
   INFO - Filling up the DagBag from /opt/airflow/dags/duckdb_example.py
   INFO - Executing cmd: uv venv --allow-existing --seed --python python /opt/airflow/tmp/venvizdpi6sh
   INFO - Output:
   INFO - Using CPython 3.12.11 interpreter at: /usr/python/bin/python
   INFO - Creating virtual environment with seed packages at: tmp/venvizdpi6sh
   INFO -  + pip==25.2
   INFO - Executing cmd: uv pip install --python /opt/airflow/tmp/venvizdpi6sh/bin/python -r /opt/airflow/tmp/venvizdpi6sh/requirements.txt
   INFO - Output:
   INFO - Resolved 7 packages in 167ms
   INFO - Installed 7 packages in 3.50s
   INFO -  + duckdb==1.1.1
   INFO -  + numpy==2.3.0
   INFO -  + pandas==2.3.2
   INFO -  + python-dateutil==2.9.0.post0
   INFO -  + pytz==2025.2
   INFO -  + six==1.17.0
   INFO -  + tzdata==2025.2
   INFO - Executing cmd: /opt/airflow/tmp/venvizdpi6sh/bin/python /opt/airflow/tmp/venv-callm_tfnrbj/script.py /opt/airflow/tmp/venv-callm_tfnrbj/script.in /opt/airflow/tmp/venv-callm_tfnrbj/script.out /opt/airflow/tmp/venv-callm_tfnrbj/string_args.txt /opt/airflow/tmp/venv-callm_tfnrbj/termination.log /opt/airflow/tmp/venv-callm_tfnrbj/airflow_context.json
   INFO - Output:
   INFO - Query result:
   INFO -     answer
   INFO - 0      42
   WARNING - Fail to delete /opt/airflow/opt/airflow/tmp/venvizdpi6sh. The directory does not exist.
   INFO - Done. Returned value was: None
   ```
   
   
   
   ### What you think should happen instead?
   
   Airflow should look for the venv in the same directory where it was created, instead of prefixing the path with /opt/airflow/ again when it's time to delete the venv.
   
   ```
   INFO - Deleted venv /opt/airflow/tmp/venvizdpi6sh.
   ```
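
   Purely as an illustration of the behaviour I expect (made-up names, not meant to mirror Airflow's internal cleanup code): remember the absolute path used when the venv is created and delete exactly that path.

   ```python
   import os
   import shutil
   import tempfile
   from pathlib import Path

   # Resolve the venv directory to an absolute path once, at creation time,
   # and reuse that exact path for cleanup (illustrative sketch only).
   venv_dir = Path(tempfile.mkdtemp(prefix="venv", dir=os.environ.get("TMPDIR", "/tmp"))).resolve()
   try:
       print(f"venv lives at {venv_dir}")  # e.g. /opt/airflow/tmp/venvXXXXXXXX
   finally:
       shutil.rmtree(venv_dir, ignore_errors=True)
       print(f"Deleted venv {venv_dir}")
   ```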
   
   
   ### How to reproduce
   
   Below is the example DAG that I am running, along with a snippet of my Docker Compose file. The Compose file is otherwise unchanged from the official one, except that I removed Flower and switched from the CeleryExecutor to the LocalExecutor.
   
   
   Example DAG
   ```python
   from airflow.sdk import DAG, task
   import pendulum
   
   with DAG(
       dag_id="duckdb_example",
       description="A simple dag to show the use of a python virtual env with 
duckdb",
       start_date=pendulum.datetime(2025, 8, 1, tz="EST"),
       schedule="@daily",
       catchup=False,
   ):
       @task.virtualenv(
           task_id="virtualenv_duckdb", 
           requirements=["duckdb==1.1.1", "numpy==2.3.0", "pandas==2.3.2"], 
           system_site_packages=False,
       )
       def run_query():
           import duckdb
           import time
           result = duckdb.query("SELECT 42 AS answer").to_df()
           print("Query result:\n", result)
   
       run_query()
   ```
   
   
   
   ### Operating System
   
   Linux - Ubuntu
   
   ### Versions of Apache Airflow Providers
   
   _No response_
   
   ### Deployment
   
   Docker-Compose
   
   ### Deployment details
   
   I'm using Docker Compose with the Dockerfile below.
   
   ```
   FROM apache/airflow:3.1.0
   ADD requirements.txt .
   RUN pip install apache-airflow==${AIRFLOW_VERSION} -r requirements.txt
   ```
   
   The docker-compose file is based on the official Airflow file. I switched from Celery to the LocalExecutor since I'm just testing Airflow locally. The section below is the main part of the docker-compose file that I changed; the only other changes were removing the parts that Celery would use, such as Flower.
   
   ```yaml
   x-airflow-common:
     &airflow-common
     # In order to add custom dependencies or upgrade provider distributions you can use your extended image.
     # Comment the image line, place your Dockerfile in the directory where you placed the docker-compose.yaml
     # and uncomment the "build" line below, Then run `docker-compose build` to build the images.
     # image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:3.1.0}
     build: .
     environment:
       &airflow-common-env
       AIRFLOW__CORE__EXECUTOR: LocalExecutor
       AIRFLOW__CORE__AUTH_MANAGER: airflow.providers.fab.auth_manager.fab_auth_manager.FabAuthManager
       AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
       AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
       AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
       AIRFLOW__CORE__FERNET_KEY: ''
       AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
       AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
       AIRFLOW__CORE__EXECUTION_API_SERVER_URL: 'http://airflow-apiserver:8080/execution/'
       AIRFLOW__SCHEDULER__ENABLE_HEALTH_CHECK: 'true'
       # WARNING: Use _PIP_ADDITIONAL_REQUIREMENTS option ONLY for a quick checks
       # for other purpose (development, test and especially production usage) build/extend Airflow image.
       _PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-}
       AIRFLOW_CONFIG: '/opt/airflow/config/airflow.cfg'
       # Set temp directory to use mounted volume
       TMPDIR: '/opt/airflow/tmp'
       UV_CACHE_DIR: '/opt/airflow/tmp'
       UV_LINK_MODE: 'copy'
     volumes:
       - ${AIRFLOW_PROJ_DIR:-.}/dags:/opt/airflow/dags
       - ${AIRFLOW_PROJ_DIR:-.}/logs:/opt/airflow/logs
       - ${AIRFLOW_PROJ_DIR:-.}/config:/opt/airflow/config
       - ${AIRFLOW_PROJ_DIR:-.}/plugins:/opt/airflow/plugins
     user: "${AIRFLOW_UID:-50000}:0"
     depends_on:
       &airflow-common-depends-on
       redis:
         condition: service_healthy
       postgres:
         condition: service_healthy
   
   ```
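
   For anyone reproducing this, a quick way to confirm from inside a task that both variables are actually visible to the worker process (purely a diagnostic snippet, not part of the repro):

   ```python
   import os

   # Print the temp-related env vars the task process actually sees.
   print("TMPDIR       =", os.environ.get("TMPDIR"))
   print("UV_CACHE_DIR =", os.environ.get("UV_CACHE_DIR"))
   ```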
   
   ### Anything else?
   
   This issue occurs with every PythonVirtualenvOperator DAG I've tried recently.
   
   ### Are you willing to submit PR?
   
   - [x] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [x] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

