pedro-cf opened a new issue, #42894:
URL: https://github.com/apache/airflow/issues/42894
### Apache Airflow version
2.10.2
### If "Other Airflow 2 version" selected, which one?
_No response_
### What happened?
I'm currently running an Airflow instance set up with the `CeleryExecutor`.
I have a task assigned to a "heavy" queue:
```python
@task.virtualenv(
    trigger_rule=TriggerRule.NONE_FAILED_MIN_ONE_SUCCESS,
    requirements=[
        "my_package==1.1.1"
    ],
    index_urls=["my_index_url"],
    venv_cache_path="/tmp",
    queue="heavy",
)
def my_task(arg1, arg2, params=None):
    ...
```
but for some reason the task is running on a worker that is configured to
consume only the default queue.
Worker queues in Flower (screenshot):

DAG run details (screenshot):
### What you think should happen instead?
Tasks assigned to a specific queue should only run on workers subscribed to
that queue.
### How to reproduce
Not sure how.
### Operating System
Ubuntu 22
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
`CeleryExecutor`
2 workers:
```
celery worker --concurrency ${DEFAULT_WORKER_CONCURRENCY}         # no -q: consumes the default queue
celery worker -q heavy --concurrency ${HEAVY_WORKER_CONCURRENCY}  # consumes only the "heavy" queue
```
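For what it's worth, the behaviour I expect from this setup is plain Celery queue subscription: a worker should only pick up tasks routed to a queue it consumes. A minimal stdlib-only sketch of that contract (no Celery involved; worker names are illustrative):

```python
# Illustrative model of the expected routing contract:
# each worker declares the queues it consumes.
workers = {
    "default-worker": {"default"},  # started without -q, so default queue only
    "heavy-worker": {"heavy"},      # started with -q heavy
}

def eligible_workers(task_queue: str) -> list[str]:
    """Return the workers that should be allowed to pick up a task."""
    return [name for name, queues in workers.items() if task_queue in queues]

# A task with queue="heavy" should only ever land on the heavy worker.
assert eligible_workers("heavy") == ["heavy-worker"]
assert eligible_workers("default") == ["default-worker"]
```

In the run I observed, the task with `queue="heavy"` was nonetheless executed by the worker started without `-q`, which contradicts this expectation.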
### Anything else?
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)