Are you using the DockerOperator? If so, the issue is here in the
create_container method:
https://github.com/apache/incubator-airflow/blob/master/airflow/operators/docker_operator.py#L207
create_container takes a host_config:
https://docker-py.readthedocs.io/en/1.2.3/api/#create_container
That host_config takes a log_config:
https://docker-py.readthedocs.io/en/1.2.3/hostconfig/
The log_config needs to be set to: {'type': 'json-file'}
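(As a side note, and just as a rough sketch rather than anything taken from the
Airflow code: docker-py will also accept an optional 'config' key next to 'type'
for driver-specific options, e.g.

    log_config = {
        'type': 'json-file',
        # driver-specific options; safe to omit if you only need the driver type
        'config': {'max-size': '10m', 'max-file': '3'},
    }

but the bare {'type': 'json-file'} is enough for this fix.)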
So the create_container call should look like this:
self.container = self.cli.create_container(
    command=self.get_command(),
    cpu_shares=cpu_shares,
    environment=self.environment,
    host_config=self.cli.create_host_config(
        binds=self.volumes,
        network_mode=self.network_mode,
        shm_size=self.shm_size,
        log_config={'type': 'json-file'}),
    image=image,
    mem_limit=self.mem_limit,
    user=self.user,
    working_dir=self.working_dir)
Unfortunately, log_config is not a parameter on the DockerOperator, so you'll
need to open a pull request to add it.
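If you want to sanity-check the docker-py side on its own before patching the
operator, something like this should do it (just a rough sketch, not from the
Airflow code; it assumes the docker package's low-level APIClient and a local
Docker socket, and on the older docker-py releases the linked docs cover, the
client class is docker.Client instead):

    import docker

    # Low-level client, the same API the operator uses under the hood
    cli = docker.APIClient(base_url='unix://var/run/docker.sock')

    # host_config carries the log_config, exactly as in the snippet above
    host_config = cli.create_host_config(
        log_config={'type': 'json-file'})

    container = cli.create_container(
        image='alpine:latest',
        command='echo hello',
        host_config=host_config)

    cli.start(container['Id'])
    cli.wait(container['Id'])
    print(cli.logs(container['Id']))

With the json-file driver in place, cli.logs() returns the container output;
if the daemon is set to a logging driver that doesn't support reading, that
call errors out instead, which is the same failure mode the operator hits.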
On Sun, Sep 16, 2018 at 2:12 AM Bhavani Ramasamy wrote:
> Hello Team,
> I am trying to set up S3 logging with Docker & the CeleryExecutor. Files are not
> written to S3. I have configured airflow.cfg like below:
>
> remote_logging = True
>
> remote_log_conn_id = s3_connection_mine
>
> remote_base_log_folder = s3://mybucket/airflow/logs/
>
>
> I have tried with *logging_config_class* left empty as well as with a custom
> log_config.py via an airflow_local_settings.py file. It also doesn't work.
> Can you please help me?
>
>
> Thanks,
>
> Bhavani
>
--
Kyle Hamlin