mchaniotakis commented on issue #39717:
URL: https://github.com/apache/airflow/issues/39717#issuecomment-2230194631
I am experiencing the same error. I am not an experienced Airflow user, but I am fairly certain this error is not caused by limited resources. In my case it seems to happen when I create DAGs dynamically in the following fashion:
```
from datetime import timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.dates import days_ago

default_args = {
    "owner": "airflow",
    "depends_on_past": False,
    "start_date": days_ago(2),
    "email": ["..."],
    "email_on_failure": False,
    "email_on_retry": False,
    "retries": 0,
    "retry_delay": timedelta(minutes=5),
}

dag = DAG(
    "my_dynamic_dag",
    default_args=default_args,
    schedule_interval="@once",
)

# This part below is simplified.
# get_items, format_dag_task_name and collect_data are my own helpers.
my_items = get_items()  # my_items : dict
seq_tasks = []
for key, value in my_items.items():
    seq_tasks.append(PythonOperator(
        task_id=format_dag_task_name(f"task_{key}"),
        python_callable=collect_data,
        op_args=[value],
        # owner is always community for base models
        op_kwargs={"collect_all": True},
        # no other DAG is active when I run this DAG and the error happens
        pool="collect_data_pool",
        retries=0,
        provide_context=True,  # deprecated in Airflow 2; context is passed automatically
        dag=None,
    ))
for task in seq_tasks:
    task.dag = dag
for i in range(1, len(seq_tasks)):
    print(f"({i}) Connecting {seq_tasks[i-1]} to {seq_tasks[i]}")
    seq_tasks[i-1] >> seq_tasks[i]
```
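As an aside, the index loop above just links each task to the previous one; Airflow also ships a `chain()` helper (`from airflow.models.baseoperator import chain`) that does the same with `chain(*seq_tasks)`. A minimal sketch of the pairwise logic, using plain strings as hypothetical stand-ins for the operators (in the real DAG each pair would be wired with `>>`):

```python
# Hypothetical stand-in task IDs; in the DAG above these would be
# PythonOperator instances and ">>" would set the dependency.
seq_tasks = [f"task_{key}" for key in ("a", "b", "c")]

# Each task depends on the previous one: (upstream, downstream) pairs.
edges = list(zip(seq_tasks, seq_tasks[1:]))
print(edges)  # → [('task_a', 'task_b'), ('task_b', 'task_c')]
```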
I have no changes in the config file except for SMTP and the env variables in the Docker Compose file mentioned here (for the Python interpreter and `donot_pickle`).
Airflow is set up on a single machine through Docker Compose: `apache/airflow:2.9.1-python3.10`.
Airflow pip packages: `apache-airflow-providers-http`, `apache-airflow-providers-slack[http]`, `apache-airflow-providers-docker`.
Let me know if this error is irrelevant to the current issue and I will delete this comment to avoid cluttering the thread.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]