Hi Nicholas. I've had the same situation: my dockerized Airflow workers run on different EC2 instances. To resolve the issue, I set the hostname of each Docker container to the IP address of the EC2 host it runs on, so the log URL the webserver builds resolves correctly. If you are using docker compose, you can add a *hostname* field to the YAML file. Otherwise, use the *-h* option of *docker run* to set the hostname.
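For example, a minimal sketch (the IP 10.0.1.23, the service name, and the image name are placeholders you would replace with your own; 8793 is the worker log-server port that appears in your error message):

    # docker-compose.yml
    version: '2'
    services:
      worker:
        image: my-airflow-worker      # placeholder image name
        hostname: 10.0.1.23           # this EC2 instance's private IP
        ports:
          - "8793:8793"               # expose the worker log server to the webserver

    # equivalent with plain docker run:
    docker run -h 10.0.1.23 -p 8793:8793 my-airflow-worker

In practice you would substitute each instance's actual private IP, for example fetched at container start from the EC2 metadata service (http://169.254.169.254/latest/meta-data/local-ipv4).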
Thanks,
Yongjun

2017-08-22 3:16 GMT+09:00 Nicholas Hodgkinson <[email protected]>:

> All,
>
> I've got a problem I'm trying to solve where I've expanded my Airflow
> machine into a cluster; this itself works, however I'm having a problem
> with accessing the logs through the UI. I know this is due to the fact
> that I run my Airflow workers (and other processes) inside Docker
> containers. I get this error when trying to access logs:
>
> *** Log file isn't local.
> *** Fetching here:
> http://f6400f7aea88:8793/log/cjob/queue/2017-08-18T22:44:09.334353
> *** Failed to fetch log file from worker.
>
> Now I understand that "f6400f7aea88" is the address of the Docker container
> within Docker, however this is not running on the same machine as the
> webserver, so this address cannot be resolved. So my question is: how can I
> change either the address that the web UI uses or the address that the
> worker reports back?
>
> Thanks,
> -Nik
> [email protected]
