Just to throw it out there, is there a reason you wouldn't take the daily results from logwatch and pump those into Elasticsearch? If dealing with hundreds of emails is the issue, that would let you run a single query showing (for example) which users had the most login failures across all systems each day.
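
Something along these lines is what I had in mind -- just a rough sketch, and the index naming scheme (logwatch-YYYY.MM.DD), the field names ("user", "event", "host"), and the document type are all placeholders you'd swap for however you actually parse your logwatch output:

    # sketch: one doc per logwatch event, then a terms aggregation for top offenders
    import json
    import datetime
    import requests

    ES = "http://localhost:9200"
    index = "logwatch-%s" % datetime.date.today().strftime("%Y.%m.%d")

    # 1) index one document per logwatch event (hypothetical fields)
    doc = {"@timestamp": datetime.datetime.utcnow().isoformat(),
           "host": "web01", "user": "bob", "event": "login_failure"}
    requests.post("%s/%s/event" % (ES, index), data=json.dumps(doc))

    # 2) ask which users had the most login failures in today's index
    query = {
        "size": 0,
        "query": {"term": {"event": "login_failure"}},
        "aggs": {"by_user": {"terms": {"field": "user", "size": 10}}}
    }
    r = requests.post("%s/%s/_search" % (ES, index), data=json.dumps(query))
    for bucket in r.json()["aggregations"]["by_user"]["buckets"]:
        print(bucket["key"], bucket["doc_count"])
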
It's also worth mentioning that Kibana can help you develop complex queries: build the search you want, then inspect the element containing it to grab the JSON it sends. Put an "index" variable into a script, change it daily, and use it in each query (dropping the timestamp limitations, obviously) to run the same set of queries every day against your logwatch results -- something like the sketch below.
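
For illustration only -- the query bodies here are made up, and in practice you'd paste in the JSON you pulled out of Kibana, minus its timestamp filter, since the daily index already scopes the data:

    # rough sketch of the "index variable" idea: same queries, new index each day
    import json
    import datetime
    import requests

    ES = "http://localhost:9200"
    index = "logwatch-%s" % datetime.date.today().strftime("%Y.%m.%d")

    # queries copied out of Kibana, timestamp filters removed (example bodies only)
    queries = {
        "failed_logins_by_user": {
            "size": 0,
            "query": {"term": {"event": "login_failure"}},
            "aggs": {"by_user": {"terms": {"field": "user", "size": 10}}},
        },
        "events_by_host": {
            "size": 0,
            "aggs": {"by_host": {"terms": {"field": "host", "size": 20}}},
        },
    }

    for name, body in queries.items():
        r = requests.post("%s/%s/_search" % (ES, index), data=json.dumps(body))
        print(name, json.dumps(r.json().get("aggregations", {}), indent=2))

Run that from cron each morning and you get the same daily summaries without wading through the individual emails.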
