Brian,
I agree completely with your expectations. If I am to replace Splunk
(ridiculously overpriced, in my opinion) or LogRhythm, I need to be able to:
1. Generate alerts that require an immediate reaction.
2. Generate reports:
- Compliance-related reporting
- Aggregations performed on the fly (e.g. social-networking traffic by
user by hour of day/week, etc.)
- Trend reports (logon failures by source IP address by destination
address by hour of day, etc.)
3. Provide an analyst with a console with which he can quickly drill down
to events of interest.
Of the three broad categories, Kibana does an admirable job of the third.
Alerts can be generated by the Logstash or nxlog instance that feeds ES,
though this is kludgy and hard to maintain at best (Graylog2 streams,
perhaps?). See the sketch below for the sort of thing I mean.
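To make the kludge concrete, here is a rough sketch of the kind of
cron-driven check I have in mind. It is only a sketch: the host, index
pattern, query string, field name, time window, and threshold are all
assumptions to adapt.

    #!/usr/bin/perl
    # Rough alert sketch: count recent logon failures in ES and complain
    # above a threshold. Index, query, and threshold are placeholders.
    use strict;
    use warnings;
    use LWP::UserAgent;
    use JSON;

    my $body = encode_json({
        query => {
            filtered => {
                query  => { query_string =>
                            { query => 'action:logon_failure' } },
                filter => { range =>
                            { '@timestamp' => { gte => 'now-5m' } } },
            },
        },
    });

    my $res = LWP::UserAgent->new->post(
        'http://localhost:9200/logstash-*/_count', Content => $body );
    die $res->status_line unless $res->is_success;

    my $count = decode_json($res->decoded_content)->{count};
    print "ALERT: $count logon failures in the last 5 minutes\n"
        if $count > 100;

Run that from cron every five minutes and pipe the output to mail, and you
have a poor man's alerting engine. It works, but every new alert means
another script, which is exactly the maintenance problem.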
I was looking to generate reports via a scripted process: basically, build
a scripting framework and vary the query as necessary (see the sketch
below). This is easier said than done at the moment.
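The skeleton of what I am after is something like this; the report name,
index pattern, and field names are made up for illustration:

    #!/usr/bin/perl
    # "Vary the query" sketch: one runner script, many canned report bodies.
    use strict;
    use warnings;

    my %reports = (
        # Social-networking traffic by user by hour; field names assumed.
        social_by_user => q({
            "size": 0,
            "aggs": {
                "by_user": {
                    "terms": { "field": "user" },
                    "aggs": {
                        "by_hour": {
                            "date_histogram":
                                { "field": "@timestamp", "interval": "hour" }
                        }
                    }
                }
            }
        }),
        # ... more canned bodies, one per report ...
    );

    my $name = shift @ARGV     or die "usage: $0 <report-name>\n";
    my $body = $reports{$name} or die "unknown report: $name\n";

    # Hand the body to ES; post-process the JSON response downstream.
    exec 'curl', '-s', '-XPOST',
         'http://localhost:9200/logstash-*/_search', '-d', $body;

The hard part is not running the query; it is turning the nested JSON
response into something tabular (more on that below).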
With respect to Kibana, it is a great product, but I believe it took a step
back in functionality from version 2 to version 3. The pace of development
now appears to match that of OSSEC WebUI, and I do not see a roadmap for
future releases. It may be that folding the product into the Elasticsearch
umbrella has moved the company in a different direction (I smell a
commercial product coming soon).
I took a look at Graylog2, and I must say it has moved forward by leaps and
bounds over the last year and does have the ability to generate alerts. I
am not sure how easy it would be to create custom reports, as I have not
tried.
What holds me back is that it only supports ES version 0.90.9. That version
does not support aggregations (only facets), so I am reluctant to put my
eggs into an old basket while ES is pushing version 1.3 and upwards at the
moment.
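The limitation is easy to show. A 0.90-era terms facet is flat (field name
assumed for illustration):

    { "facets": { "by_user": { "terms": { "field": "user" } } } }

There is no way to nest a per-hour breakdown inside that bucket, whereas
the 1.x "aggs" body in the report sketch above nests a date_histogram
inside the terms aggregation. That nesting is exactly what by-user-by-hour
reports need.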
Until the smoke clears, I believe we will have to rely on building
scripting skills. I am personally struggling to parse JSON output with
nested fields into CSV efficiently so that I can feed R etc.
programmatically. My current attempt looks something like the sketch below.
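For what it is worth, this is roughly where I am. It assumes the nested
terms/date_histogram response shape from the report sketch above, with the
same made-up aggregation names:

    #!/usr/bin/perl
    # Flatten a nested terms/date_histogram aggregation response into CSV.
    # Reads the raw ES response on stdin, writes CSV on stdout.
    use strict;
    use warnings;
    use JSON;

    local $/;                       # slurp the whole response
    my $resp = decode_json(<STDIN>);

    print "user,hour,count\n";
    for my $user (@{ $resp->{aggregations}{by_user}{buckets} }) {
        for my $hour (@{ $user->{by_hour}{buckets} }) {
            # key_as_string is the readable form of the histogram key;
            # a real script should quote fields properly (Text::CSV).
            print join(',', $user->{key}, $hour->{key_as_string},
                       $hour->{doc_count}), "\n";
        }
    }

Chained together that becomes, e.g., ./report.pl social_by_user |
./flatten.pl > report.csv, and then read.csv() on the R side; but anything
deeper than two levels of nesting still has me writing one-off loops.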
Regards
Ash Kumar
On Thursday, September 25, 2014 11:57:44 AM UTC-4, Brian wrote:
>
> Ash,
>
> JSON is a natural for Kibana's JavaScript to read and therefore emit as
> CSV. So what I was really asking is: is Kibana going to become a serious
> contender and allow user-written commands to be inserted into the pipeline
> between data query/response and charting? After my few weeks with R, I
> have gotten it to far exceed GNUPlot for plotting (even with the base
> plotting functions; I haven't yet dived into the ggplot2 package), and to
> also far exceed Kibana. For example, setting up a custom dashboard is
> tedious, and the result is not easily customizable.
>
> Now, I am not suggesting that the ELK stack turn into Splunk directly. But
> since it wants to become a serious contender, I am strongly recommending
> that the ELK team take the next step and allow a user-written command to be
> run against the Kibana output and its charting. And I recommend that the
> output be CSV because that's what R supports so naturally. And with R, I
> can build out custom analysis scripts that are flexible (and not hard-coded
> like Kibana dashboards).
>
> For example, I have an R script that gives me the most commonly used
> functions that the Splunk timechart command offers, with all of their
> customizability: selecting the fields to use as the analysis, the "by"
> field (for example, plotting response time by host name), the statistics
> (mean, max, 95th percentile, and so on), even splitting the colors so that
> the plot instantly shows the distribution of load across 10 hosts that
> reside within two data centers.
>
> This is an excellent (and free) book that shows what Splunk can do by way
> of clear examples:
>
> http://www.splunk.com/goto/book
>
> Again, I don't suggest that Kibana duplicate this. But I strongly suggest
> that Kibana give me a way to insert my own commands into the processing so
> that I can implement the specific functions that our group requires, and
> can do it without my gorpy Perl script and copy-paste command mumbo-jumbo,
> and instead in a much more friendly and accessible way that even the PMs
> can run from their Windows laptops without touching the command line.
>
> And as my part of the bargain, I will use Perl, R, or whatever else is at
> my disposal to create custom commands that can run on the Kibana host and
> perform all of the analysis that our group needs.
>
> Brian
>
> On Wednesday, September 24, 2014 4:34:43 PM UTC-4, Ashit Kumar wrote:
>>
>> Brian,
>>
>> I like the direction you are going in and am trying to do the same
>> myself. However, being a Perl fledgling, I am still battling Data::Dumper
>> etc. I would appreciate it if you could share your code to convert an ES
>> query to CSV. I want to use aggregations and print/report/graph the
>> results. Kibana is very pretty and does the basics well, but I want to
>> know who used web mail, order it by volume of data sent by hour of day,
>> and either graph, tabulate, or CSV out the result. I just can't see how
>> to do that with Kibana.
>>
>> Thanks
>>
>> Ash
>>
>>