AnastasiyaIvanovaNovade opened a new issue #19368:
URL: https://github.com/apache/airflow/issues/19368
### Apache Airflow version
2.2.1 (latest released)
### Operating System
Linux (ubuntu 20.04)
### Versions of Apache Airflow Providers
```
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-microsoft-azure==3.1.0
apache-airflow-providers-microsoft-mssql==2.0.0
apache-airflow-providers-postgres==2.0.0
apache-airflow-providers-salesforce==3.1.0
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-ssh==2.1.0
```
### Deployment
Other
### Deployment details
Airflow installed on Azure virtual machine (Standard D8s v3 (8 vcpus, 32 GiB
memory)), the VM is dedicated for Airflow only
### What happened
After updating Airflow from 2.1.3 to 2.2.1 and running the database upgrade, I ran
`airflow scheduler` and got this error:
```
[2021-11-02 11:43:20,846] {scheduler_job.py:644} ERROR - Exception when executing SchedulerJob._run_scheduler_loop
Traceback (most recent call last):
  File "/home/DataPipeline/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 628, in _execute
    self._run_scheduler_loop()
  File "/home/DataPipeline/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 709, in _run_scheduler_loop
    num_queued_tis = self._do_scheduling(session)
  File "/home/DataPipeline/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 782, in _do_scheduling
    self._create_dagruns_for_dags(guard, session)
  File "/home/DataPipeline/.local/lib/python3.8/site-packages/airflow/utils/retries.py", line 76, in wrapped_function
    for attempt in run_with_db_retries(max_retries=retries, logger=logger, **retry_kwargs):
  File "/home/DataPipeline/.local/lib/python3.8/site-packages/tenacity/__init__.py", line 382, in __iter__
    do = self.iter(retry_state=retry_state)
  File "/home/DataPipeline/.local/lib/python3.8/site-packages/tenacity/__init__.py", line 349, in iter
    return fut.result()
  File "/usr/lib/python3.8/concurrent/futures/_base.py", line 437, in result
    return self.__get_result()
  File "/usr/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
    raise self._exception
  File "/home/DataPipeline/.local/lib/python3.8/site-packages/airflow/utils/retries.py", line 85, in wrapped_function
    return func(*args, **kwargs)
  File "/home/DataPipeline/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 847, in _create_dagruns_for_dags
    self._create_dag_runs(query.all(), session)
  File "/home/DataPipeline/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 917, in _create_dag_runs
    self._update_dag_next_dagruns(dag, dag_model, active_runs_of_dags[dag.dag_id])
  File "/home/DataPipeline/.local/lib/python3.8/site-packages/airflow/jobs/scheduler_job.py", line 926, in _update_dag_next_dagruns
    if total_active_runs >= dag_model.max_active_runs:
TypeError: '>=' not supported between instances of 'int' and 'NoneType'
```
The scheduler is not able to run.
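The failing line compares an `int` count of active runs against `dag_model.max_active_runs`, which apparently came back as `NULL` (Python `None`) from the migrated `dag` table. A minimal sketch of that comparison (not Airflow code; the variable names and the fallback limit of 16, Airflow's documented `max_active_runs_per_dag` default, are illustrative assumptions):

```python
# Sketch of the failing comparison in SchedulerJob._update_dag_next_dagruns.
total_active_runs = 1   # int, counted from active DagRuns
max_active_runs = None  # NULL in the migrated `dag` table row -> None

try:
    if total_active_runs >= max_active_runs:
        pass
except TypeError as exc:
    # Reproduces the scheduler crash:
    # '>=' not supported between instances of 'int' and 'NoneType'
    print(exc)

# A hypothetical defensive guard (not the project's actual fix) would fall
# back to a default limit before comparing:
effective_limit = max_active_runs if max_active_runs is not None else 16
print(total_active_runs >= effective_limit)
```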
### What you expected to happen
This behaviour started immediately after the update. I have another VM running
Airflow; the two machines are set up identically, differing only in the VM's location.
### How to reproduce
After updating to 2.2.1, change the config in accordance with the deprecation
warnings and run `airflow scheduler`.
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)