On 05/20/2018 12:31 PM, Himmat Singh wrote:
Hi Team,
I am using the OpenShift logging image, version below, to provide
centralized logging for our OpenShift cluster and for external
environment logs.
registry.access.redhat.com/openshift3/logging-fluentd:v3.9
I am trying to add functionality on top of this image to meet an
additional requirement.
As per that requirement, I have created the configuration files below
to collect node security logs and ingest them into Elasticsearch via
mux. The source file, input-pre-secure.conf:
<source>
  @type tail
  @label @INGRESS
  @id secure-input
  path /var/log/secure*
  read_from_head true
  pos_file /var/log/secure.log.pos
  tag audit.log
  format none
</source>
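Since the source uses format none, every tailed line arrives as a record with the raw text under a single message key, which is what the parser filter's key_name message relies on. A minimal sketch of that behavior (the function is illustrative, not a fluentd API):

```python
# Sketch of what fluentd's `none` parser produces for each tailed line:
# the raw line is stored under a single "message" key.
def tail_none_record(line: str) -> dict:
    # Strip the trailing newline, keep the rest verbatim.
    return {"message": line.rstrip("\n")}

# Hypothetical /var/log/secure line.
record = tail_none_record(
    "May 20 12:31:01 node1 sshd[1234]: Accepted password for admin "
    "from 10.0.0.5 port 51234 ssh2\n"
)
print(record["message"])
```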
And the filter file, filter-pre-secure.conf:
<filter audit.log>
  @type parser
  key_name message
  format grok
  <grok>
    pattern (?<timestamp>%{WORD} %{DATA} %{TIME}) %{HOSTNAME:host_target} sshd\[%{BASE10NUM}\]: (?<EventType>%{WORD} %{WORD}) (?<USERNAME>%{WORD}) from %{IP:src_ip} port %{BASE10NUM:port}
  </grok>
  <grok>
    pattern (?<timestamp>%{WORD} %{DATA} %{TIME}) %{HOSTNAME:host_target} sshd\[%{BASE10NUM}\]: %{DATA:EventType} for %{USERNAME:username} from %{IP:src_ip} port %{BASE10NUM:port} ssh2
  </grok>
</filter>
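One way to sanity-check the second grok pattern without a running fluentd is a rough Python translation using equivalent regular expressions. The simplified stand-ins for %{HOSTNAME} and %{IP}, and the sample log line, are my own assumptions:

```python
import re

# Rough Python equivalent of the second grok pattern, for offline testing.
# Simplified translations: %{WORD} -> \w+, %{DATA} -> .*?,
# %{TIME} -> \d{2}:\d{2}:\d{2}, %{HOSTNAME} -> [\w.-]+,
# %{BASE10NUM} -> \d+, %{IP} -> dotted-quad of digits.
PATTERN = re.compile(
    r"(?P<timestamp>\w+ .*? \d{2}:\d{2}:\d{2}) "
    r"(?P<host_target>[\w.-]+) sshd\[\d+\]: "
    r"(?P<EventType>.*?) for (?P<username>\w+) "
    r"from (?P<src_ip>\d{1,3}(?:\.\d{1,3}){3}) port (?P<port>\d+) ssh2"
)

# Hypothetical sample line in /var/log/secure format.
sample = ("May 20 12:31:01 node1 sshd[1234]: "
          "Accepted password for admin from 10.0.0.5 port 51234 ssh2")
m = PATTERN.match(sample)
print(m.group("EventType"), m.group("username"),
      m.group("src_ip"), m.group("port"))
```

If the real pattern fails to match, fluentd's parser filter drops or flags the record, so checking the pattern against real sample lines first can save a deploy cycle.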
Modified Dockerfile:
FROM registry.access.redhat.com/openshift3/logging-fluentd:v3.9

COPY fluent-plugin-grok-parser-1.0.1.gem .
RUN gem install fluent-plugin-grok-parser-1.0.1.gem
COPY input-pre-secure.conf /etc/fluent/configs.d/openshift/
COPY filter-pre-secure.conf /etc/fluent/configs.d/openshift/
I have deployed the updated logging image to the mux and fluentd
daemonsets. After making these configuration changes, I am not able to
get any logs into Elasticsearch.
I want all the security logs from /var/log/secure to be filtered
according to our specific requirements and written to the .operations
index. What configuration do I need for these logs to be written to
the operations index?
Please help me with a solution, or any suggestion, along with the
correct configuration files.
So in another issue I commented about this:
https://github.com/openshift/origin-aggregated-logging/issues/1141#issuecomment-389301880
You are going to want to create your own index name and bypass all other
processing.
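For reference, a bypass along those lines might look roughly like the following match block. The host, port, and index name here are placeholders, and the exact output plugin settings in the shipped image may differ:

```
# Illustrative only: send audit.log records straight to a dedicated
# index instead of routing them through the default OpenShift pipeline.
<match audit.log>
  @type elasticsearch
  host logging-es
  port 9200
  index_name node-security
</match>
```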
Thanks and Regards,
Himmat Singh.
_______________________________________________
users mailing list
[email protected]
http://lists.openshift.redhat.com/openshiftmm/listinfo/users