Re: [graylog2] Full list of Permissions?

2016-08-10 Thread Pete GS
Awesome, thanks Edmundo! That helps a great deal.

Cheers,
Pete

On Tue, Aug 9, 2016 at 7:16 PM, Edmundo Alvarez wrote:
> Hi Pete,
>
> Here are the permissions available in 2.0:
> https://github.com/Graylog2/graylog2-server/blob/2.0/graylog2-server/src/main/java/

[graylog2] Retention Period

2016-08-10 Thread Nathan Mace
What is the default retention period of data indexed by Graylog / Elasticsearch? All I can find is the Graylog setting "retention_strategy" being set to delete, which is fine, but I can't find what the time period is for that.

Thanks!
Nathan
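For context on Nathan's question: out of the box, Graylog 2.x retention is not time-based at all; it rotates indices and then retains a fixed number of them. A sketch of the relevant `server.conf` settings (the values shown match the 2.x sample config as I recall it, so verify against your own file):

```
# server.conf (sketch; check the defaults shipped with your version)
rotation_strategy = count                   # rotate the active index by message count
elasticsearch_max_docs_per_index = 20000000 # messages per index before rotation
elasticsearch_max_number_of_indices = 20    # retention kicks in past this count
retention_strategy = delete                 # oldest index is deleted, not closed
```

So with these defaults the "retention period" is however long it takes to fill 20 indices of 20 million messages each, which depends entirely on ingest rate.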

[graylog2] Re: Use Graylog To Dedup Events?

2016-08-10 Thread Nathan Mace
I saw that, but was hoping there was more up-to-date / better news. That post on GitHub is two years old.

Nathan

On Wednesday, August 10, 2016 at 6:23:34 AM UTC-4, Jochen Schalanda wrote:
> Hi Nathan,
>
> deduplication of messages is currently not supported by Graylog, see

[graylog2] Question about sending ALL windows event log data

2016-08-10 Thread Jamie P
I wanted to confirm that the following config would have nxlog send all event logs on a Windows Server (Domain Controller or otherwise) to a Graylog instance.

## This is a sample configuration file. See the nxlog reference manual about the
## configuration options. It should be installed
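Since the pasted config is cut off in the archive, here is a minimal sketch of the usual shape of such an nxlog.conf for shipping Windows event logs to Graylog as GELF. Host, port, and the route name are placeholders; whether `im_msvistalog` with no query really subscribes to every channel varies by nxlog version, so check the nxlog reference manual and add an explicit `QueryXML` listing channels if needed:

```
## Sketch only: ship Windows event log entries to Graylog via GELF over UDP
<Extension gelf>
    Module  xm_gelf
</Extension>

<Input eventlog>
    Module  im_msvistalog
    # No Query/QueryXML here; depending on nxlog version you may need an
    # explicit query to cover channels beyond the default set.
</Input>

<Output graylog>
    Module      om_udp
    Host        graylog.example.org   # placeholder
    Port        12201                 # must match the GELF UDP input in Graylog
    OutputType  GELF
</Output>

<Route eventlog_to_graylog>
    Path    eventlog => graylog
</Route>
```

A matching GELF UDP input has to be running on the Graylog side for this to land anywhere.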

[graylog2] Re: /var/log/graylog/graylog.log

2016-08-10 Thread julioqc47
Alright, that's pretty clear :) The doc could use some clarification about all of this structure, but overall it was fairly simple to configure once you know what configures what! Thank you all for the assistance.

On Wednesday, August 10, 2016 at 2:56:58 PM UTC-4, Jochen Schalanda wrote:
> Hi

[graylog2] Re: /var/log/graylog/graylog.log

2016-08-10 Thread Jochen Schalanda
Hi Julio,

On Wednesday, 10 August 2016 17:34:35 UTC+2, juli...@gmail.com wrote:
> So what is the point of the log4j2.xml file then if logs are configured
> with svlogd (on the OVA image at least)? What log does it create?

The log4j2.xml file is only being used to configure Graylog's
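To make the division of labor concrete: log4j2.xml controls what the graylog-server process itself logs and where, independently of whatever svlogd does on the OVA. An illustrative (heavily trimmed) sketch of such a file; the paths, sizes, and layout are examples, not what Graylog actually ships:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch only; the log4j2.xml bundled with Graylog is more extensive. -->
<Configuration>
    <Appenders>
        <RollingFile name="rolling-file"
                     fileName="/var/log/graylog-server/server.log"
                     filePattern="/var/log/graylog-server/server.log.%i.gz">
            <PatternLayout pattern="%d %-5p: %c - %m%n"/>
            <Policies>
                <SizeBasedTriggeringPolicy size="50MB"/>
            </Policies>
            <DefaultRolloverStrategy max="10"/>
        </RollingFile>
    </Appenders>
    <Loggers>
        <Root level="info">
            <AppenderRef ref="rolling-file"/>
        </Root>
    </Loggers>
</Configuration>
```

Elasticsearch, by contrast, reads its own logging.yml, which is why editing log4j2.xml has no effect on the Elasticsearch log file discussed in this thread.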

[graylog2] kafka.common.KafkaException: Failed to acquire lock on file .lock in /var/opt/graylog/data/journal

2016-08-10 Thread Ralph @GM
Our Graylog instance won't start. /var/log/graylog/server/current shows the following failures:

2016-08-10_16:56:49.58726 Caused by: kafka.common.KafkaException: Failed to acquire lock on file .lock in /var/opt/graylog/data/journal. A Kafka instance in another process or
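The error means the journal's `.lock` file is already held, almost always because another graylog-server (or leftover Kafka) process is still running against the same journal directory. As an illustration of the mechanism only (Kafka actually uses Java's `FileLock`, not Python's `fcntl`, and the path here is a throwaway temp file), a second attempt to take an exclusive non-blocking lock on an already-locked file is refused:

```python
import fcntl
import os
import tempfile

# Stand-in for Graylog's journal .lock file (illustrative path only).
lock_path = os.path.join(tempfile.mkdtemp(), ".lock")

# First holder (think: the running graylog-server) takes an exclusive lock.
first = open(lock_path, "w")
fcntl.flock(first, fcntl.LOCK_EX | fcntl.LOCK_NB)

# A second opener of the same file (think: a second server process pointed
# at the same journal) is refused -- the situation behind the KafkaException.
second = open(lock_path, "w")
try:
    fcntl.flock(second, fcntl.LOCK_EX | fcntl.LOCK_NB)
    outcome = "acquired"
except BlockingIOError:
    outcome = "refused"

print(outcome)
```

Because locks of this kind are released when the holding process exits, a persistent failure on startup usually points to a second live process rather than a stale file: check for duplicate graylog-server processes before considering removing `.lock` by hand.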

[graylog2] Re: /var/log/graylog/graylog.log

2016-08-10 Thread julioqc47
Ok, yes, I found the logging.yml and the config matches the output. I'm getting confused now... So what is the point of the log4j2.xml file then if logs are configured with svlogd (on the OVA image at least)? What log does it create? From what I read here

Re: [graylog2] regex in pipeline

2016-08-10 Thread Edmundo Alvarez
Hi Avdhoot,

Regular expressions in pipeline rules use the Java syntax, so you need to double-escape backslashes. If you need more information about regexes in Java, please take a look at the documentation: https://docs.oracle.com/javase/8/docs/api/java/util/regex/Pattern.html

Hope that helps.
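A minimal sketch of what Edmundo describes (the rule name, field names, and pattern are illustrative, not Avdhoot's actual rule): inside a pipeline rule the pattern is a string literal, so a regex class like `\d` has to be written `\\d`:

```
rule "extract numeric source id"
when
  has_field("source")
then
  // "\\d+" in the rule source reaches the Java regex engine as "\d+"
  let m = regex("^app-(\\d+)$", to_string($message.source));
  set_field("source_id", m["0"]);
end
```

With a single backslash the rule fails to parse, which is why the editor refuses to save it.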

[graylog2] Re: /var/log/graylog/graylog.log

2016-08-10 Thread Jochen Schalanda
Hi Julio,

the log4j2.xml file is from Graylog, but the log file you've described is from Elasticsearch. Please read the page I've linked to in my last post.

Cheers,
Jochen

On Wednesday, 10 August 2016 15:18:44 UTC+2, juli...@gmail.com wrote:
> Ok ty!
> I guess log4j2.xml is for that log file

[graylog2] Re: /var/log/graylog/graylog.log

2016-08-10 Thread julioqc47
Ok, ty! I guess log4j2.xml is for that log file's settings and svlogd.conf is for other internal logs.

On Wednesday, August 10, 2016 at 6:25:40 AM UTC-4, Jochen Schalanda wrote:
> Hi Julio,
>
> the file you've mentioned is being generated by Elasticsearch and can be
> configured in its logging

[graylog2] regex function in pipeline

2016-08-10 Thread Avdhoot Dendge
I am trying to match the source field with a regex, but the Graylog text editor does not allow me to save the rule. What is wrong with the rule below? Please check the attached file for the Graylog editor error.

graylog version: 2.1.0-beta.2
regex sample: http://regexr.com/3dvvs

rule "check prod logs" when

[graylog2] regex in pipeline

2016-08-10 Thread Avdhoot Dendge
I am trying to match the message source field with a regex in a pipeline, but the Graylog text editor is not allowing me to save. What is wrong with the rule below? Check the attached file for the Graylog editor error.

rule "test" when has_field("source") then regex(pattern:

[graylog2] Re: sidecar-collector feature

2016-08-10 Thread Jochen Schalanda
For reference: https://github.com/Graylog2/collector-sidecar/issues/44

On Wednesday, 10 August 2016 06:09:16 UTC+2, Werner van der Merwe wrote:
> Currently in our setup we use a lot of Execs inside an input block.
> Is the only way to have execs currently to rather create snippets?
>
> Would it

[graylog2] Re: My squid drool is not working

2016-08-10 Thread Jochen Schalanda
Hi Daniel,

maybe using a normal Grok extractor would be sufficient for your needs, e.g. https://gist.github.com/MiteshShah/6879e09b6999d5c8e77c. Regarding your Drools rules file, please check the logs of your Graylog node for errors.

Cheers,
Jochen

On Tuesday, 9 August 2016 18:31:44 UTC+2,
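For a sense of what such a Grok extractor looks like: a pattern for the default squid access.log format runs roughly along these lines. The field names are my own and squid log formats are configurable, so treat this as a starting point rather than a drop-in pattern:

```
%{NUMBER:timestamp}\s+%{NUMBER:response_time} %{IP:client_ip} %{WORD:cache_result}/%{NUMBER:http_status} %{NUMBER:bytes} %{WORD:http_method} %{NOTSPACE:request_url} %{NOTSPACE:user} %{WORD:hierarchy_code}/%{NOTSPACE:peer_host} %{NOTSPACE:content_type}
```

Attached as a Grok extractor on the input, this splits each squid line into searchable fields without needing a Drools rule at all.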

[graylog2] Re: /var/log/graylog/graylog.log

2016-08-10 Thread Jochen Schalanda
Hi Julio,

the file you've mentioned is being generated by Elasticsearch and can be configured in its logging configuration. See https://www.elastic.co/guide/en/elasticsearch/reference/2.3/setup-configuration.html#logging for further details.

Cheers,
Jochen

On Tuesday, 9 August 2016

[graylog2] Re: Use Graylog To Dedup Events?

2016-08-10 Thread Jochen Schalanda
Hi Nathan,

deduplication of messages is currently not supported by Graylog, see https://github.com/Graylog2/graylog2-server/issues/466 for a related feature request.

Cheers,
Jochen

On Tuesday, 9 August 2016 22:15:06 UTC+2, Nathan Mace wrote:
> In Splunk, it is easy to search for all of the