potiuk commented on issue #17120:
URL: https://github.com/apache/airflow/issues/17120#issuecomment-884753926
I think I know the reason. In the Scheduler, the `SchedulerJob` is instantiated
before the daemon context is activated. `SchedulerJob` is a database ORM object
from SQLAlchemy, and instantiating it opens a connection to Postgres:
```
job = SchedulerJob(
    subdir=process_subdir(args.subdir),
    num_runs=args.num_runs,
    do_pickle=args.do_pickle,
)
```
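The underlying problem can be reproduced outside Airflow. Here is a minimal standalone sketch (hypothetical connection string, not Airflow code) that follows SQLAlchemy's documented advice for forked processes:
```
import os
from sqlalchemy import create_engine

# A connection made before the fork leaves a live Postgres socket in the
# engine's pool, which the child process cannot safely reuse.
engine = create_engine("postgresql://user:pass@localhost/airflow")
engine.connect().close()  # the pool now holds an established socket

if os.fork() == 0:
    # Child: dispose of the inherited pool so any further work opens
    # fresh connections owned by this process.
    engine.dispose()
    engine.connect().close()  # a brand-new socket, created after the fork
```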
When you activate the daemon context, what happens under the hood is that the
process forks, and while some of the open file handles are preserved in the
daemonized process (stdout and stderr, but also the opened log file handle),
the established socket for the DB connection is not:
```
handle = setup_logging(log_file)
with open(stdout, 'w+') as stdout_handle, open(stderr, 'w+') as stderr_handle:
    ctx = daemon.DaemonContext(
        pidfile=TimeoutPIDLockFile(pid, -1),
        files_preserve=[handle],
        stdout=stdout_handle,
        stderr=stderr_handle,
    )
```
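One possible shape of the fix (a sketch only, reusing the names from the snippets above; the actual change may look different) is to defer the `SchedulerJob` instantiation until the daemon context has been entered, so the DB socket is opened inside the daemonized process:
```
handle = setup_logging(log_file)
with open(stdout, 'w+') as stdout_handle, open(stderr, 'w+') as stderr_handle:
    ctx = daemon.DaemonContext(
        pidfile=TimeoutPIDLockFile(pid, -1),
        files_preserve=[handle],
        stdout=stdout_handle,
        stderr=stderr_handle,
    )
    with ctx:
        # Instantiated only after daemonization, so the ORM object opens
        # its DB connection in the daemonized process, not in the parent.
        job = SchedulerJob(
            subdir=process_subdir(args.subdir),
            num_runs=args.num_runs,
            do_pickle=args.do_pickle,
        )
        job.run()
```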
I will add a fix for that in a moment.