Is there any documentation on creating a new sensor in Metron?

On Wed, 18 Oct 2017 at 01.22 Simon Elliston Ball <
si...@simonellistonball.com> wrote:

> Best bet there is to create a new sensor config using the grok parser
> type. So you would, for example, have a Kafka topic called host_dhcp and a
> sensor called host_dhcp with the relevant grok pattern.
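A minimal sketch of the sensor config Simon describes, modeled loosely on Metron's squid example; the grokPath, patternLabel, and timestampField values below are assumptions for a hypothetical host_dhcp pattern, not anything defined in this thread:

```json
{
  "parserClassName": "org.apache.metron.parsers.GrokParser",
  "sensorTopic": "host_dhcp",
  "parserConfig": {
    "grokPath": "/patterns/host_dhcp",
    "patternLabel": "HOST_DHCP",
    "timestampField": "timestamp"
  }
}
```

The sensorTopic matches the Kafka topic name, so messages landing on host_dhcp are picked up by this parser.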
>
> Simon
>
>
> On 17 Oct 2017, at 19:19, Youzha <yuza.ras...@gmail.com> wrote:
>
> That's what I mean.
> Which sensor do I need for this case, especially when I want to parse
> some host logs into Metron enrichment and indexing?
>
> On Wed, 18 Oct 2017 at 01.03 Simon Elliston Ball <
> si...@simonellistonball.com> wrote:
>
>> What you want to do in this setting is just TailFile, then just push to
>> Kafka. The grok piece is more efficiently handled in the Metron grok parser.
>>
>> Push to a Kafka topic named for your sensor, then set up a sensor (a
>> parser topology to do the grok parsing and any transformation you need).
>> Each sensor gets its own parser topology.
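As a rough sketch of that step, a per-sensor parser topology is typically started with Metron's start_parser_topology.sh script, pointing at the sensor name (the broker and ZooKeeper host:port values here are placeholders):

```
$METRON_HOME/bin/start_parser_topology.sh \
  -k kafka-broker:6667 \
  -z zookeeper-host:2181 \
  -s host_dhcp
```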
>>
>> Simon
>>
>>
>> On 17 Oct 2017, at 19:00, Youzha <yuza.ras...@gmail.com> wrote:
>>
>> After the NiFi process:
>>
>> TAILFILE -> TRANSFORM_TO_GROK -> PUSH_KAFKA
>>
>> what Metron topology can I use to process the data in Kafka, so it can
>> be enriched by Metron? I've checked the article about adding a new
>> telemetry source with Squid; there is a squid topology that ingests
>> from the squid topic in Kafka and then puts the result on the enrichment
>> Kafka topic. What about my use case above? Is there a topology I can use?
>>
>> On Wed, 18 Oct 2017 at 00.30 Otto Fowler <ottobackwa...@gmail.com> wrote:
>>
>>> So,
>>> there are several options for parsing the data and enriching it:
>>>
>>> 1.  A native parser (Java), which you have noticed is not there
>>> 2.  An instance of the grok parser, with grok rules that parse the input
>>> 3.  If it is CSV, an instance of the CSV parser
>>> 4.  If it is JSON, an instance of the JSONMap parser
>>>
>>> If these cannot be applied to your file, then your options are:
>>>
>>> 1.  Write, or open a JIRA for, a native parser
>>> 2.  Find a way to transform your data into one of the above formats, so
>>> you can use those parsers.  This again is where NiFi can help.  Something like:
>>>
>>>
>>> [nifi]
>>>
>>> TAILFILE -> TRANSFORM_TO_JSON -> PUSH_KAFKA
>>>
>>> where TRANSFORM_TO_JSON is a script processor or something built in,
>>> depending on your format.
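As an illustration of what such a TRANSFORM_TO_JSON step might do (shown here as plain Python rather than a NiFi script processor), the sketch below turns one sshd "Failed password" line from /var/log/secure into JSON that a JSONMap parser could consume. The regex and output field names are assumptions for this example, not a Metron convention:

```python
import json
import re

# Illustrative only: the pattern targets a typical sshd failed-login line;
# adjust it for the real log format you are tailing.
LINE_RE = re.compile(
    r"(?P<timestamp>\w{3}\s+\d+ \d{2}:\d{2}:\d{2}) (?P<host>\S+) sshd\[\d+\]: "
    r"Failed password for (?:invalid user )?(?P<user>\S+) from (?P<ip>\S+)"
)

def to_json(line):
    """Return a JSON string for a matching line, or None if it doesn't match."""
    m = LINE_RE.match(line)
    if m is None:
        return None
    record = m.groupdict()
    record["event_type"] = "failed_login"
    return json.dumps(record)

print(to_json("Oct 17 19:00:01 web01 sshd[4242]: "
              "Failed password for root from 10.0.0.5 port 22 ssh2"))
```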
>>>
>>>
>>>
>>> On October 17, 2017 at 13:16:05, Youzha (yuza.ras...@gmail.com) wrote:
>>>
>>> Hi Laurens, thanks for your reply.
>>>
>>> Yes, your suggestion is absolutely right: I was able to ingest the logs
>>> into Kafka. But how can Metron enrich and index all of it? I think there
>>> are only bro, snort, yaf, pcap, and websphere Storm parser topologies in
>>> Metron. So how can Metron read the log telemetry and process it, so I
>>> can use it for event correlation?
>>>
>>> On Tue, 17 Oct 2017 at 23.11 Laurens Vets <laur...@daemon.be> wrote:
>>>
>>>> Hi Youzha,
>>>>
>>>> Either check how the snort logs on the full-dev installation are
>>>> ingested (I believe it's with a script), or check the Apache NiFi
>>>> project, which makes it very easy to read logs in almost any format
>>>> and ingest them into Metron via Kafka.
>>>>
>>>> On 2017-10-17 08:53, Youzha wrote:
>>>>
>>>> Is it possible to ingest other logs, /var/log/secure for example, as
>>>> new telemetry in Metron? I've seen the Metron architecture on the
>>>> website, like the picture below: host logs, email, AV, etc. can be
>>>> telemetry into the event buffer in Metron. If this is possible, could
>>>> you give me some suggestions on how to do it?
>>>>
>>>>
>>>> On Tue, 17 Oct 2017 at 21.00 Nick Allen <n...@nickallen.org> wrote:
>>>>
>>>>> If you want to look at failed login attempts for each user over time,
>>>>> then the Profiler might be a good solution.  Your profile will depend
>>>>> on the fields available in your telemetry, but as an example it would
>>>>> look something like this:
>>>>>
>>>>>
>>>>> {
>>>>>   "profile": "failed-logins",
>>>>>   "foreach": "user.name",
>>>>>   "onlyif": "source.type == 'activedirectory' and event.type ==
>>>>> 'failed_login'",
>>>>>   "init": { "count": 0 },
>>>>>   "update": { "count": "count + 1" },
>>>>>   "result": "count"
>>>>> }
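Once a profile like the one above is being written, its values can be read back in enrichment or threat triage with the Stellar profiler client; the 30-minute window below is just an illustrative choice, and the entity field must match the profile's "foreach":

```
PROFILE_GET('failed-logins', user.name, PROFILE_FIXED(30, 'MINUTES'))
```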
>>>>>
>>>>>
>>>>> You can find an introduction and more information on using the
>>>>> Profiler below.
>>>>> *
>>>>> https://github.com/apache/metron/tree/master/metron-analytics/metron-profiler
>>>>> * https://www.slideshare.net/secret/GFBf2RTXBG35PB
>>>>>
>>>>> Best of luck
>>>>>
>>>>> On Tue, Oct 17, 2017 at 4:51 AM, tkg_cangkul <yuza.ras...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> For example,
>>>>>>
>>>>>> I want to try to correlate between logs: how many times user A has
>>>>>> failed to log in and how many times user A has logged in successfully,
>>>>>> including details such as IP, timestamp, etc.
>>>>>> Is this possible to do with Metron?
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On 17/10/17 02:56, James Sirota wrote:
>>>>>>
>>>>>>> What specifically are you looking to correlate?  Can you talk a
>>>>>>> little more about your use case?
>>>>>>>
>>>>>>> 16.10.2017, 02:23, "tkg_cangkul" <yuza.ras...@gmail.com>:
>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> Could anyone explain event correlation using Apache Metron to me?
>>>>>>>> Does Metron support event correlation?
>>>>>>>>
>>>>>>>> Please advise.
>>>>>>>
>>>>>>> -------------------
>>>>>>> Thank you,
>>>>>>>
>>>>>>> James Sirota
>>>>>>> PMC- Apache Metron
>>>>>>> jsirota AT apache DOT org
>>>>>>
>>>>>>
>>>>
