RE: monitoring solr logs

2014-01-02 Thread elmerfudd
Is it possible to monitor all the nodes in a SolrCloud together, that way?



--
View this message in context: 
http://lucene.472066.n3.nabble.com/monitoring-solr-logs-tp4108721p4109067.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: monitoring solr logs

2014-01-02 Thread Otis Gospodnetic
Hi,

Absolutely!
In the case of SPM, you'd put the small SPM client on each node you want to
monitor.  For shipping logs you can use a number of methods -
https://sematext.atlassian.net/wiki/display/PUBLOGSENE/

Otis
--
Performance Monitoring * Log Analytics * Search Analytics
Solr & Elasticsearch Support * http://sematext.com/



monitoring solr logs

2013-12-30 Thread adfel70
Hi,
I'm trying to figure out which Solr and ZooKeeper logs I should monitor and
collect.
All the logs will be written to a file, but I want to collect some of them
with Logstash in order to be able to analyze them efficiently.
Any input on which classes' logs I should collect?

thanks.




--
View this message in context: 
http://lucene.472066.n3.nabble.com/monitoring-solr-logs-tp4108721.html
Sent from the Solr - User mailing list archive at Nabble.com.


RE: monitoring solr logs

2013-12-30 Thread Tim Potter
I'm using logstash4solr (http://logstash4solr.org) for something similar ...

I set up Solr to use Log4J by passing the following on the command line when 
starting Solr: 
-Dlog4j.configuration=file:///$SCRIPT_DIR/log4j.properties

Then I use a custom Log4J appender that writes to RabbitMQ: 

https://github.com/plant42/rabbitmq-log4j-appender

You can then configure a RabbitMQ input for logstash - 
http://logstash.net/docs/1.3.2/inputs/rabbitmq

This decouples the log writes from log indexing in logstash4solr, which scales 
better for active Solr installations.

Btw ... I just log everything from Solr using this approach but you can use 
standard Log4J configuration settings to limit which classes / log levels to 
send to the RabbitMQ appender.
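
For example, a log4j.properties sketch along those lines -- the appender class 
and its properties are assumptions based on the linked plant42 project, not 
verified against it; only the Threshold and per-logger level settings are 
standard Log4J:

# "file" is assumed to be your existing local file appender
log4j.rootLogger=INFO, file, rabbitmq
log4j.appender.rabbitmq=com.plant42.log4j.appenders.RabbitMQAppender
# only WARN and above go to RabbitMQ (standard AppenderSkeleton property)
log4j.appender.rabbitmq.Threshold=WARN
# quiet a chatty class for all appenders
log4j.logger.org.apache.solr.core.SolrCore=WARN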

Cheers,

Timothy Potter
Sr. Software Engineer, LucidWorks
www.lucidworks.com



RE: monitoring solr logs

2013-12-30 Thread adfel70
Actually I was considering using logstash4solr, but it didn't seem mature
enough.
Does it work fine? Any known bugs?

Are you collecting the logs in the same Solr cluster you use for the
production systems?
If so, what will you do if for some reason Solr is down and you would like
to analyze the logs to see what happened?

Btw, I started a new Solr cluster with 7 shards and replicationFactor=3 and
ran an indexing job of 400K docs.
It got stuck at 150K because I used SocketAppender directly to write to
Logstash and the Logstash disk got full.

That's why I moved to using AsyncAppender, and I plan on moving to
RabbitMQ.
But this is also why I wanted to filter some of the logs: indexing 150K docs
produced 50GB of logs.
That seemed like too much.
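
For reference, Log4J 1.x's AsyncAppender is wired up via appender-ref, so it
is typically configured in log4j.xml rather than a properties file. A minimal
sketch (the "rabbitmq" appender name here is hypothetical -- point the ref at
whatever appender you want to decouple):

<appender name="async" class="org.apache.log4j.AsyncAppender">
  <param name="BufferSize" value="512"/>
  <param name="Blocking" value="false"/>
  <appender-ref ref="rabbitmq"/>
</appender>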




--
View this message in context: 
http://lucene.472066.n3.nabble.com/monitoring-solr-logs-tp4108721p4108737.html
Sent from the Solr - User mailing list archive at Nabble.com.


RE: monitoring solr logs

2013-12-30 Thread Tim Potter
We (LucidWorks) are actively developing logstash4solr, so if you have 
issues, let us know. So far, so good for me. I upgraded to logstash 1.3.2 -- 
even though the logstash4solr distribution includes 1.2.2, you can use the newer one. 
I'm not quite in production with my logstash4solr -> rabbit-mq -> log4j -> Solr 
solution yet though ;-)

Yeah, 50GB is too much logging for only 150K docs. Maybe start by filtering by 
log level (WARN and more severe). If a server crashes, you're likely to see 
some errors on the logstash side, but sometimes you may have to SSH to the 
specific box and look at the local log (so definitely append all messages to 
the local Solr log too). I'm using something like the following for local 
logging:

log4j.rootLogger=INFO, file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.MaxFileSize=50MB
log4j.appender.file.MaxBackupIndex=10
log4j.appender.file.File=logs/solr.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{ISO8601} [%t] %-5p %c{3} %x - %m%n
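
And on the logstash side, the matching RabbitMQ input is just a few lines. A
sketch -- the host and queue names are placeholders, not values from my setup:

input {
  rabbitmq {
    host => "localhost"     # broker host (placeholder)
    queue => "solr-logs"    # queue your appender publishes to (placeholder)
    durable => true
  }
}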


Timothy Potter
Sr. Software Engineer, LucidWorks
www.lucidworks.com



--
View this message in context: 
http://lucene.472066.n3.nabble.com/monitoring-solr-logs-tp4108721p4108737.html
Sent from the Solr - User mailing list archive at Nabble.com.


RE: monitoring solr logs

2013-12-30 Thread adfel70
And are you using any tool like Kibana as a dashboard for the logs?



--
View this message in context: 
http://lucene.472066.n3.nabble.com/monitoring-solr-logs-tp4108721p4108744.html
Sent from the Solr - User mailing list archive at Nabble.com.


RE: monitoring solr logs

2013-12-30 Thread Tim Potter
I've just been using the Solr query form so far :P but have plans to try out 
Kibana too. Let me know how that goes for you and I'll do the same.


From: adfel70 adfe...@gmail.com
Sent: Monday, December 30, 2013 10:06 AM
To: solr-user@lucene.apache.org
Subject: RE: monitoring solr logs

And are you using any tool like kibana as a dashboard for the logs?



Tim Potter wrote
 We're (LucidWorks) are actively developing on logstash4solr so if you have
 issues, let us know. So far, so good for me but I upgraded to logstash
 1.3.2 even though the logstash4solr version includes 1.2.2 you can use the
 newer one. I'm not quite in production with my logstash4solr - rabbit-mq
 - log4j - Solr solution yet though ;-)

 Yeah, 50GB is too much logging for only 150K docs. Maybe start by
 filtering by log level (WARN and more severe). If a server crashes, you're
 likely to see some errors in the logstash side but sometimes you may have
 to SSH to the specific box and look at the local log (so definitely append
 all messages to the local Solr log too), I'm using something like the
 following for local logging:

 log4j.rootLogger=INFO, file
 log4j.appender.file=org.apache.log4j.RollingFileAppender
 log4j.appender.file.MaxFileSize=50MB
 log4j.appender.file.MaxBackupIndex=10
 log4j.appender.file.File=logs/solr.log
 log4j.appender.file.layout=org.apache.log4j.PatternLayout
 log4j.appender.file.layout.ConversionPattern=%d{ISO8601} [%t] %-5p %c{3}
 %x - %m%n


 Timothy Potter
 Sr. Software Engineer, LucidWorks
 www.lucidworks.com

 
 From: adfel70 lt;

 adfel70@

 gt;
 Sent: Monday, December 30, 2013 9:34 AM
 To:

 solr-user@.apache

 Subject: RE: monitoring solr logs

 Actually I was considering using logstash4solr, but it didn't seem mature
 enough.
 does it work fine? any known bugs?

 are you collecting the logs in the same solr cluster you use for the
 production systems?
 if so, what will you do if for some reason solr is down and you would like
 to analyze the logs to see what happend?

 btw, i started a new solr cluster with 7 shards, replicationfactor=3 and
 run
 indexing job of 400K docs,
 it got stuck on 150K because I used Socketappender directly to write to
 logstash and logstash disk got full.

 that's why I moved to using AsyncAppender, and I plan on moving to using
 rabbit.
 but this is also why I wanted to filter some of the logs. indexing 150K
 docs
 prodcued 50GB of logs.
 this seemed too much.




 Tim Potter wrote
 I'm using logstash4solr (http://logstash4solr.org) for something similar
 ...

 I setup my Solr to use Log4J by passing the following on the command-line
 when starting Solr:
 -Dlog4j.configuration=file:///$SCRIPT_DIR/log4j.properties

 Then I use a custom Log4J appender that writes to RabbitMQ:

 https://github.com/plant42/rabbitmq-log4j-appender

 You can then configure a RabbitMQ input for logstash -
 http://logstash.net/docs/1.3.2/inputs/rabbitmq

 This decouples the log writes from log indexing in logstash4solr, which
 scales better for active Solr installations.

 Btw ... I just log everything from Solr using this approach but you can
 use standard Log4J configuration settings to limit which classes / log
 levels to send to the RabbitMQ appender.

 Cheers,

 Timothy Potter
 Sr. Software Engineer, LucidWorks
 www.lucidworks.com

 
 From: adfel70 lt;

 adfel70@

 gt;
 Sent: Monday, December 30, 2013 8:15 AM
 To:

 solr-user@.apache

 Subject: monitoring solr logs

 hi
 i'm trying to figure out which solr and zookeeper logs i should monitor
 and
 collect.
 All the logs will be written to a file but I want to collect some of them
 with logstash in order to be able to analyze them efficiently.
 any inputs on logs of which classes i should collect?

 thanks.




 --
 View this message in context:
 http://lucene.472066.n3.nabble.com/monitoring-solr-logs-tp4108721.html
 Sent from the Solr - User mailing list archive at Nabble.com.





 --
 View this message in context:
 http://lucene.472066.n3.nabble.com/monitoring-solr-logs-tp4108721p4108737.html
 Sent from the Solr - User mailing list archive at Nabble.com.





--
View this message in context: 
http://lucene.472066.n3.nabble.com/monitoring-solr-logs-tp4108721p4108744.html
Sent from the Solr - User mailing list archive at Nabble.com.


RE: monitoring solr logs

2013-12-30 Thread Otis Gospodnetic
Hi,

You should look at Logsene:
http://sematext.com/logsene (free)

It has Kibana + a native UI, it's not limited to logs, and if you are
monitoring your Solr and/or ZooKeeper with SPM you can have your
performance metric graphs and your logs side by side for much more
efficient troubleshooting.

Otis
Solr & Elasticsearch Support
http://sematext.com/