Hi everyone,
A customer of ours uses aggregated logging on the OpenShift platform and
has two requirements:
- forwarding all logs to an external Kafka instance;
- aggregating multiline logs (e.g. Java stack traces) on the fluentd
  side before sending them to Kafka.
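For context, both behaviours can be sketched with a minimal Fluentd configuration built on the fluent-plugin-concat and fluent-plugin-kafka gems; the tag pattern, record key, regexp, broker address and topic below are illustrative placeholders, not the customer's actual settings:

```
# Merge multiline events (e.g. Java stack traces) into a single record
# before forwarding. Tag pattern and regexp are placeholder assumptions.
<filter kubernetes.**>
  @type concat                                  # from fluent-plugin-concat
  key log                                       # field holding the raw log line
  multiline_start_regexp /^\d{4}-\d{2}-\d{2}/   # new record starts with a timestamp
  flush_interval 5s                             # emit a partial event if no continuation arrives
</filter>

# Forward the merged events to an external Kafka instance.
<match kubernetes.**>
  @type kafka2                                  # from fluent-plugin-kafka
  brokers kafka.example.com:9092                # placeholder broker address
  default_topic openshift-logs                  # placeholder topic
  <format>
    @type json
  </format>
</match>
```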

I have packaged the Fluentd gems that provide this functionality as RPMs
(except for a few development prerequisites required only for unit
testing) and have prepared several patches to the openshift-ansible
installation playbook and to the fluentd log aggregator repository.

I would like to contribute this work upstream. Since it involves
coordination between several repositories (and possibly several review
rounds), could you please advise me on how best to proceed?

Thanks in advance,
Alessandro Menti

dev mailing list
