Hi Jose, thanks for the reply!

Indeed, today the new index was created with the template applied, but only the 
ossec index. The ossecall index did not work; its fields still appear as "Analyzed Field".

I did not do the procedure:
$ cd ~/ossec_tmp/ossec-wazuh/extensions/ElasticSearch/ && curl -XPUT "http://localhost:9200/_template/ossec/" -d "@elastic-ossec-template.json"

I just used the Logstash output configuration that I mentioned.

But in the file "/etc/logstash/elastic-ossec-template2.json" (the one referenced by 
template =>), I modified lines 3 and 8:
Line 3: from "template": "ossec*" to "template": "ossecall*"
Line 8: from "ossec": to "ossecall":
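
For reference, a rough excerpt of how those two places look in my modified copy 
(the rest of the file is unchanged; only those two names were replaced):

{
  ...
  "template": "ossecall*",
  ...
  "mappings": {
    "ossecall": {
      ...
    }
  }
}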

I do not know if it was really necessary to do this. I did it because I decided 
to create a separate index for the archives.json log file, where OSSEC logs 
everything.

About "After that, probably you will need to reindex all your index to 
apply the new template."
Do you have any procedure to do this?


On Wednesday, September 28, 2016 at 18:01:12 UTC-3, jose wrote:
>
> Hi Roberto,
>
> Have you applied the custom mapping?
>
>
> http://documentation.wazuh.com/en/latest/ossec_elk_elasticsearch.html#ossec-alerts-template
>
> If you have the custom mapping applied, and the template in Logstash, you 
> need to wait until the next day, when the next index is created with the new 
> mapping and template.
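>
> A quick way to verify that the template is registered in Elasticsearch (a 
> minimal check, assuming Elasticsearch is listening on localhost:9200) is:
>
> curl -XGET "http://localhost:9200/_template/ossec?pretty"
>
> If this returns an empty object, the template was not loaded.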
>
> After that, you will probably need to reindex all your indices to apply the 
> new template.
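>
> One possible way to do that (just a sketch; it assumes Elasticsearch 2.3 or 
> later, where the _reindex API is available) is to copy each daily index into 
> a new one whose name still matches the template pattern, for example:
>
> curl -XPOST "http://localhost:9200/_reindex" -d '
> {
>   "source": { "index": "ossec-2016.09.27" },
>   "dest":   { "index": "ossec-2016.09.27-reindexed" }
> }'
>
> The index names above are only illustrative; the destination index picks up 
> the new mapping because its name still matches the "ossec*" template pattern.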
>
>
> Regards
> -----------------------
> Jose Luis Ruiz
> Wazuh Inc.
> [email protected] <javascript:>
>
> On September 28, 2016 at 3:26:38 PM, [email protected] 
> ([email protected]) wrote:
>
> Hi Pedro!
>
> I am using OSSEC Wazuh and I have a question about indexes.
> I had implemented Logstash without using the file "elastic-ossec-template.json", 
> but I saw it would be good to use it: in some indexes Kibana shows 
> "Analyzed Field" for fields like "AgentName".
>
> I put the template in the Logstash configuration, but the index has not 
> changed to "not analyzed".
>
>
> My Logstash output:
>
> output {
>
>   # for archives.json log
>   if [type] == "ossecall" {
>     elasticsearch {
>       hosts => "127.0.0.1:9200"
>       index => "ossecall-%{+YYYY.MM.dd}"
>       document_type => "ossecall"
>       template => "/etc/logstash/elastic-ossec-template2.json"
>       template_name => "ossecall"
>       template_overwrite => true
>     }
>   }
>   # for alerts.json log
>   else {
>     elasticsearch {
>       hosts => "127.0.0.1:9200"
>       index => "ossec-%{+YYYY.MM.dd}"
>       document_type => "ossec"
>       template => "/etc/logstash/elastic-ossec-template.json"
>       template_name => "ossec"
>       template_overwrite => true
>     }
>   }
> }
>
> Can you help me?
>
>
>
> On Thursday, June 2, 2016 at 08:25:09 UTC-3, Pedro S wrote: 
>>
>> Hi Maxim,  
>>
>> How are you forwarding the alerts/archives to Kibana?
>>
>> I think you will need the archives JSON output setting. If you are using 
>> Wazuh <http://wazuh.com/>, edit *ossec.conf* and add the following 
>> setting:
>>
>>   <global>
>>     <logall_json>yes</logall_json>
>>   </global>
>>
>>
>>
>> Once you do it, you will find the new archives.json events file at:
>>
>> /var/ossec/logs/archives/archives.json
>>
>>
>>
>> The next step is to forward these archives events to Elasticsearch; to do 
>> that we need to edit the Logstash configuration.
>>
>> My personal advice for indexing archives events is to create a dedicated 
>> index pattern just for them, so you will be able to distinguish between 
>> events and alerts, by adding the following configuration inside the 
>> "output" section:
>>
>> output {
>>     if [type] == "ossec-alerts" {
>>         elasticsearch {
>>              hosts => ["127.0.0.1:9200"]
>>              index => "ossec-%{+YYYY.MM.dd}"
>>              document_type => "ossec"
>>              template => "/etc/logstash/elastic-ossec-template.json"
>>              template_name => "ossec"
>>              template_overwrite => true
>>         }
>>     }
>>     if [type] == "ossec-archives" {
>>         elasticsearch {
>>              hosts => ["127.0.0.1:9200"]
>>              index => "ossec-archives-%{+YYYY.MM.dd}"
>>              document_type => "ossec"
>>              template => "/etc/logstash/elastic-ossec-template.json"
>>              template_name => "ossec"
>>              template_overwrite => true
>>         }
>>     }
>> }
>>
>>
>> Later, in Kibana, you will need to create a new index pattern 
>> (Settings -> Indices) matching "ossec-archives-*".
>>
>> If you need to "reindex" or read a log file from the beginning using 
>> Logstash, you can use the file input with option *start_position* set to 
>> *beginning* (+ info) 
>> <https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html#plugins-inputs-file-start_position>
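>>
>> A minimal input sketch for that (the path is the archives file mentioned 
>> above, and type => "ossec-archives" matches the output conditional; 
>> sincedb_path => "/dev/null" makes Logstash forget the position it already 
>> recorded, so the whole file is read again):
>>
>> input {
>>     file {
>>         path => "/var/ossec/logs/archives/archives.json"
>>         codec => "json"
>>         type => "ossec-archives"
>>         start_position => "beginning"
>>         sincedb_path => "/dev/null"
>>     }
>> }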
>>
>>
>>
>> On Monday, May 30, 2016 at 4:53:10 PM UTC+2, Maxim Surdu wrote: 
>>>
>>> I have these archives files with logs, but in Kibana I cannot see them.
>>> Can I reindex these files?
>>> If I can, please help me step by step.
>>>
>>> On Thursday, May 19, 2016 at 10:17:51 UTC+3, Maxim Surdu wrote: 
>>>>
>>>> Hi dear community,
>>>>
>>>> I had a problem with Logstash. After I resolved it, I saw that logs are 
>>>> missing in Kibana. How can I resolve the problem and reindex all my logs 
>>>> into Kibana?
>>>> I will be thankful if someone can help me step by step.
>>>>
>>>>
>>>> I appreciate your help, and a lot of respect for the developers and the 
>>>> community!
>>>>
