With the current logging stack, the recommended options are either to
reconfigure your application to log to stdout, or to add a sidecar
container that tails the log file in question and writes it to stdout,
where it is ultimately picked up by fluentd.
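
As a rough sketch of the sidecar approach (the image name, log path, and
volume name below are just placeholders for whatever your application
actually uses):

    apiVersion: v1
    kind: Pod
    metadata:
      name: app-with-log-sidecar
    spec:
      containers:
      - name: app
        image: my-app:latest        # placeholder for your application image
        volumeMounts:
        - name: app-logs
          mountPath: /var/log/app   # assumes the app writes its custom log here
      - name: log-tailer            # sidecar: streams the file to its own stdout
        image: busybox
        command: ["/bin/sh", "-c", "tail -F /var/log/app/custom.log"]
        volumeMounts:
        - name: app-logs
          mountPath: /var/log/app
          readOnly: true
      volumes:
      - name: app-logs
        emptyDir: {}                # shared between the app and the sidecar

Once the sidecar writes the file to its stdout, the node's logging driver
(journald or json-file, depending on the version) captures it like any
other container log and fluentd delivers it to Elasticsearch.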

On Wed, Aug 15, 2018 at 2:47 AM, Leo David <[email protected]> wrote:

> Hi Everyone,
> I have logging with fluentd / elasticsearch at the cluster level running
> fine; everything works as expected.
> I have an issue though...
> What would be the procedure to add some custom log files from different
> containers (logs that are not shown on stdout) so that they are delivered
> to elasticsearch as well?
> I have two different clusters (3.7 and 3.9) up and running, and I know
> that in 3.7 the docker logging driver is configured with journald whilst
> in 3.9 it is json-file.
> Any thoughts on this ?
> Thanks a lot !
>
> --
> Best regards, Leo David
>


--
Jeff Cantrill
Senior Software Engineer, Red Hat Engineering
OpenShift Logging
Red Hat, Inc.
*Office*: 703-748-4420 | 866-546-8970 ext. 8162420
[email protected]
http://www.redhat.com
_______________________________________________
users mailing list
[email protected]
http://lists.openshift.redhat.com/openshiftmm/listinfo/users
