adrian-edbert commented on issue #57336:
URL: https://github.com/apache/airflow/issues/57336#issuecomment-3454780568

   Tested some more. Previously I was using our own fork for the worker; I reinstalled using the PyPI package directly.

   I still can't get it to work. My exact setup for the worker uses this uv script to install:
   
   ```
   # pwd = /home/generic_scheduler/test_airflow/
   uv venv --seed --python 3.12 test-venv
   uv pip install apache-airflow[celery]==3.1.0 --python test-venv \
       --constraint https://raw.githubusercontent.com/apache/airflow/constraints-3.1.0/constraints-3.12.txt
   uv pip install apache-airflow[celery]==3.1.0 --python test-venv \
       -r test_airflow/generic_scheduler-venv/worker_requirements.txt
   ```
   
   worker_requirements.txt
   ```
   apache-airflow-client
   apache-airflow[statsd]==3.1.0
   mysqlclient==2.2.7
   pymysql==1.1.1
   aiomysql==0.2.0
   authlib==1.6.0
   apache-airflow-providers-fab==2.4.4
   apache-airflow-providers-git==0.0.8
   apache-airflow-providers-apache-spark==5.3.1
   apache-airflow-providers-apache-hdfs==4.10.1
   ```
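Since `apache-airflow` is pinned both on the uv command line and again inside worker_requirements.txt, one thing worth ruling out is a version conflict between the two sources. A small stdlib-only sketch of that check (the requirement lists below just mirror a subset of the setup above; nothing here is Airflow code):

```python
import re

# Requirement sources from the setup above (subset, for illustration).
cli_pins = ["apache-airflow[celery]==3.1.0"]
file_pins = [
    "apache-airflow-client",           # unpinned
    "apache-airflow[statsd]==3.1.0",
    "mysqlclient==2.2.7",
    "pymysql==1.1.1",
]

def pinned_versions(reqs):
    """Map base package name -> pinned version for '==' requirements."""
    pins = {}
    for req in reqs:
        m = re.match(r"([A-Za-z0-9_.-]+?)(?:\[[^\]]*\])?==(.+)$", req)
        if m:
            pins[m.group(1).lower()] = m.group(2)
    return pins

cli, txt = pinned_versions(cli_pins), pinned_versions(file_pins)
# Packages pinned in both places but to different versions.
conflicts = {p: (cli[p], txt[p]) for p in cli.keys() & txt.keys() if cli[p] != txt[p]}
print("conflicts:", conflicts)  # an empty dict means the pins agree
```

In this case both sources pin 3.1.0, so the check comes back empty and a mixed install is unlikely to be the cause.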
   
   this is run using systemd
   ```
   [Unit]
   Description=Airflow celery worker daemon
   After=network.target
   
   [Service]
   EnvironmentFile=/home/generic_scheduler/test_airflow/generic_scheduler.systemd.env
   User=generic_scheduler
   Type=simple
   ExecStart=/bin/bash -c "/home/generic_scheduler/test_airflow/test-venv/bin/airflow celery worker -q test_queue --pid /home/generic_scheduler/test_airflow/run/worker.pid"
   Restart=on-failure
   RestartSec=10s
   
   [Install]
   WantedBy=multi-user.target
   ```
   
   generic_scheduler.systemd.env
   ```
   AIRFLOW_HOME=/home/generic_scheduler/test_airflow
   AIRFLOW__SECRETS__BACKEND=airflow.secrets.local_filesystem.LocalFilesystemBackend
   AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_file_path": "/home/generic_scheduler/test_airflow/connection.yaml"}
   ```
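Airflow decodes `AIRFLOW__SECRETS__BACKEND_KWARGS` as a JSON object, so one quick sanity check is that the value decodes cleanly. A sketch using the exact value from the env file above:

```python
import json

# The exact AIRFLOW__SECRETS__BACKEND_KWARGS value from the env file above.
backend_kwargs = (
    '{"connections_file_path": '
    '"/home/generic_scheduler/test_airflow/connection.yaml"}'
)

# json.loads raises json.JSONDecodeError if the value is malformed.
kwargs = json.loads(backend_kwargs)
print(kwargs["connections_file_path"])
```

The value decodes fine here, so a malformed kwargs string doesn't look like the problem either.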
   
   connection.yaml
   ```
   worker_connection_env:
     conn_type: Generic
     description: worker specific connection env for sanity test
   ```
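For reference, this is roughly the lookup I expect the filesystem backend to perform, written as a stdlib-only analogue (illustration only, not Airflow's actual implementation; the dict mirrors the parsed contents of connection.yaml above):

```python
# Parsed form of connection.yaml above.
CONNECTIONS = {
    "worker_connection_env": {
        "conn_type": "Generic",
        "description": "worker specific connection env for sanity test",
    },
}

def get_connection(conn_id):
    """Return the connection mapping for conn_id, or None if it is unknown."""
    return CONNECTIONS.get(conn_id)

conn = get_connection("worker_connection_env")
print(conn["conn_type"] if conn else "not found")
```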
   
   The full DAG looks like this. I omitted some parts before (the queue and access_control) because I didn't think they were relevant; the operator is still the same:
   ```
   
   from airflow import DAG

   with DAG(
       dag_id='testdag',
       catchup=False,
       default_args={
           "queue": "test_queue",
           "owner": "user",
           "run_as_user": "user",
       },
       access_control={
           'test_role': {'DAGs': {'can_read'}},
           'test_role_ops': {'DAGs': {'can_edit'}, 'DAG Runs': {'can_create'}},
       },
       tags=["sanity_test"],
   ):
       ...  # tasks omitted; the operator is unchanged
   
   ```
   
   I am still not sure why the compat shim didn't work for my setup.

