GitHub user s-peryt closed a discussion: Getting NotRegistered error on simple 
task despite [celery] imports configuration

I'm encountering a persistent NotRegistered error in my Airflow setup and have 
exhausted the standard debugging steps. Even a minimal test case fails, 
suggesting a fundamental issue with how my Celery worker is being configured by 
Airflow.

**My Environment:**

- Airflow: apache-airflow==3.0.2
- Providers: apache-airflow-providers-celery==3.12.0
- Executor: CeleryExecutor

**Minimal Test Case**
To isolate the issue, I removed all other files from my dags folder, leaving 
only two simple files:

dags/simple_task.py
```python
import logging
from airflow.providers.celery.executors.celery_executor import app

log = logging.getLogger(__name__)

@app.task
def my_simple_test_task(message):
    """A minimal task that only logs a message."""
    log.info("SUCCESS! The simple task ran with message: %s", message)
```
dags/test_dag.py
```python
from __future__ import annotations
import pendulum
from airflow.decorators import dag, task
from simple_task import my_simple_test_task

@dag(
    dag_id='minimal_celery_test',
    schedule=None,
    start_date=pendulum.now(),
    catchup=False
)
def minimal_celery_test_dag():
    @task
    def trigger_the_simple_task():
        my_simple_test_task.delay("Testing Celery import.")

    trigger_the_simple_task()

minimal_celery_test_dag()
```
**Configuration and Debugging Steps**
My airflow.cfg is configured to import this module:

airflow.cfg
```ini
[celery]
imports = simple_task
```
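For comparison, plain Celery picks up the same setting from a `celeryconfig` module; this is a hedged sketch of the standalone-Celery equivalent (illustrative only, Airflow does not read this file):

```python
# celeryconfig.py -- plain-Celery equivalent of `imports = simple_task`
# (illustrative comparison only; not part of an Airflow deployment)
imports = ["simple_task"]  # modules imported at worker startup so their
                           # @app.task definitions get registered
```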
I have already tried the following steps multiple times:

1. **Hard Resetting Services**: Completely stopping the airflow scheduler and 
airflow celery worker processes and restarting them.
2. **Clearing Cache**: Deleting all __pycache__ directories and .pyc files from 
my project.
3. **Verifying File Location**: Ensuring both simple_task.py and test_dag.py 
are directly inside the dags folder referenced in the config.
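For reference, step 2 can be sketched as (assuming a POSIX shell, run from the project root):

```shell
# Delete compiled-bytecode caches so stale .pyc files cannot shadow edits
find . -type d -name "__pycache__" -prune -exec rm -rf {} +
find . -name "*.pyc" -delete
```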

**The Result**
When I run the minimal_celery_test DAG, the trigger_the_simple_task task sends 
the job, but it immediately fails on the worker (as I can see in the Flower 
dashboard) with the following error:

`NotRegistered('simple_task.my_simple_test_task')`

When I check the Celery worker's startup logs, the `[tasks]` section only lists 
the default Airflow tasks; `my_simple_test_task` is missing, which confirms 
it's not being registered.
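For anyone digging into why this surfaces as NotRegistered: Celery looks tasks up by dotted name in a registry that is only populated as a side effect of importing the defining module. This toy sketch (plain Python, not Celery's actual implementation; all names here are illustrative) shows the mechanism:

```python
# Toy stand-in for Celery's task registry (not real Celery code).
registry = {}

def task(fn):
    """Toy @app.task: importing the module runs this and registers fn."""
    registry[f"{fn.__module__}.{fn.__name__}"] = fn
    return fn

@task
def my_simple_test_task(message):
    return message

def run_by_name(name, *args):
    # A worker that never imported the defining module has no registry
    # entry for the name, which is what surfaces as NotRegistered.
    if name not in registry:
        raise KeyError(f"NotRegistered({name!r})")
    return registry[name](*args)
```

If the module is never imported on the worker, the decorator never runs and the lookup fails, which matches the task being absent from the worker's `[tasks]` list.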

**My Question:**
Given that this minimal configuration appears correct, what could be causing 
the Airflow Celery worker to completely ignore the [celery] imports setting in 
airflow.cfg? Are there any other known issues, environmental factors, or 
configurations specific to Airflow 3 that could lead to this behavior?

GitHub link: https://github.com/apache/airflow/discussions/52675
