Fokko, Thanks for clarifying.
The tasks are still in the waiting state and are not being run. The scheduler
generates the tasks, but Celery never executes them. There appears to be no
communication between Celery and the Airflow scheduler.
[2017-12-07 18:41:12,155] {dag_processing.py:559} INFO - Processor for
recon_daily.py finished
[2017-12-07 18:41:12,158] {dag_processing.py:627} INFO - Started a process
(PID: 23371) to generate tasks for recon_daily.py - logging into
/logs/scheduler/2017-12-07/recon_daily.py.log
[2017-12-07 18:41:12,170] {jobs.py:1002} INFO - No tasks to send to the
executor
[2017-12-07 18:41:12,170] {jobs.py:1440} INFO - Heartbeating the executor
[2017-12-07 18:41:12,179] {jobs.py:1450} INFO - Heartbeating the scheduler
[2017-12-07 18:41:13,187] {jobs.py:1404} INFO - Heartbeating the process
manager
[2017-12-07 18:41:13,187] {dag_processing.py:559} INFO - Processor for
recon_daily.py finished
[2017-12-07 18:41:13,190] {dag_processing.py:627} INFO - Started a process
(PID: 23376) to generate tasks for check_recon_daily.py - logging into
/logs/scheduler/2017-12-07/recon_daily.py.log
[2017-12-07 18:41:13,202] {jobs.py:1002} INFO - No tasks to send to the
executor
[2017-12-07 18:41:13,203] {jobs.py:1440} INFO - Heartbeating the executor
On Wed, Dec 6, 2017 at 11:56 PM, Driesprong, Fokko <[email protected]>
wrote:
> Hi Veeranagouda,
>
> This is expected when you run both the webserver and the worker on the same
> machine. The worker exposes port 8793 for serving logs. When the worker
> runs on a different machine and you request the logs from the web UI, the
> UI fetches them over port 8793. But when everything runs on the same
> machine (with the same config), the logs are found locally, so the remote
> transport over port 8793 is not required. You should not experience any
> issues when ignoring this error.
>
> Please let me know if this answers your question.
>
> Cheers, Fokko
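To see whether something is already bound to the log-server port before starting the worker, a minimal sketch (the host and the default port 8793 are assumptions based on the traceback above):

```python
import socket

def port_in_use(host: str, port: int) -> bool:
    # Try to bind a TCP socket; an OSError (EADDRINUSE) means another
    # process, e.g. an already-running serve_logs, holds the port.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind((host, port))
            return False
        except OSError:
            return True

# Probe the default worker log-server port before starting a worker.
if port_in_use("0.0.0.0", 8793):
    print("port 8793 is taken - another serve_logs instance may be running")
```

Binding rather than connecting is deliberate: it reproduces exactly the operation that fails in the traceback.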
>
> 2017-12-07 4:24 GMT+01:00 Veeranagouda Mukkanagoudar <
> [email protected]>:
>
> > Hi,
> >
> > I am trying to configure Celery (on the same machine as Airflow), but I
> > am seeing an "Address already in use" error in the worker logs. I see the
> > following error; if I kill the process listening on port 8793, the
> > process gets stuck. Has anyone experienced this issue during setup?
> >
> >
> > Starting flask
> > [2017-12-07 03:01:28,430] {_internal.py:87} INFO - * Running on
> > http://0.0.0.0:8793/ (Press CTRL+C to quit)
> >
> > *************** Following is the exception ****************
> >
> > Using executor CeleryExecutor
> > Starting flask
> > Traceback (most recent call last):
> >   File "/usr/bin/airflow", line 28, in <module>
> >     args.func(args)
> >   File "/usr/lib/python3.6/dist-packages/airflow/bin/cli.py", line 858, in serve_logs
> >     host='0.0.0.0', port=WORKER_LOG_SERVER_PORT)
> >   File "/usr/lib64/python3.6/dist-packages/flask/app.py", line 843, in run
> >     run_simple(host, port, self, **options)
> >   File "/usr/lib/python3.6/dist-packages/werkzeug/serving.py", line 739, in run_simple
> >     inner()
> >   File "/usr/lib/python3.6/dist-packages/werkzeug/serving.py", line 699, in inner
> >     fd=fd)
> >   File "/usr/lib/python3.6/dist-packages/werkzeug/serving.py", line 593, in make_server
> >     passthrough_errors, ssl_context, fd=fd)
> >   File "/usr/lib/python3.6/dist-packages/werkzeug/serving.py", line 504, in __init__
> >     HTTPServer.__init__(self, (host, int(port)), handler)
> >   File "/usr/lib64/python3.6/socketserver.py", line 453, in __init__
> >     self.server_bind()
> >   File "/usr/lib64/python3.6/http/server.py", line 136, in server_bind
> >     socketserver.TCPServer.server_bind(self)
> >   File "/usr/lib64/python3.6/socketserver.py", line 467, in server_bind
> >     self.socket.bind(self.server_address)
> > OSError: [Errno 98] Address already in use
> >
>
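For reference, the port that serve_logs binds to comes from `WORKER_LOG_SERVER_PORT`, read from airflow.cfg, so a bind conflict on one machine can also be avoided by moving the log server to a free port. A sketch, assuming the Airflow 1.x section and key (the value 8794 is an arbitrary example):

```ini
[celery]
# Port for the worker's log-serving endpoint; the default is 8793.
worker_log_server_port = 8794
```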