Yes, that is possible, but we also monitor Apache, Tomcat, the JVM, and
the OS through JMX and other live monitoring interfaces. Why invent a
real-time HTTP log analysis system when I can fetch /search/stats.jsp at
any time?
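
That said, if you want it on a schedule, a cron-able fetch is only a few
lines of Java. Untested sketch -- the URL is the path on our install, so
adjust host, port, and path for yours:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;

// Untested sketch: fetch the live stats page and dump it to stdout.
// Pipe it through grep, or parse out the fields you care about.
public class StatsPoller {
    public static void main(String[] args) throws Exception {
        URL stats = new URL("http://localhost:8080/search/stats.jsp");
        BufferedReader in = new BufferedReader(
                new InputStreamReader(stats.openStream()));
        for (String line = in.readLine(); line != null;
                line = in.readLine()) {
            System.out.println(line);
        }
        in.close();
    }
}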

By "number of rows fetched", do you mean "number of documents matched"?

The log you describe is pretty useful. Ultraseek has something similar,
and that is the log admins use most often. I'd recommend also logging
the start and rows parameters of the request so you can distinguish
between new queries and second-page requests. If possible, use the same
timestamp format as the HTTP access log so you can correlate the
entries.
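
Since you're on Tomcat, the AccessLogValve already covers part of your
list: %h is the client IP, %r the request, %s the status, and (if memory
serves) %D the elapsed milliseconds. For a log you control, a servlet
filter is one place to hook in. Untested sketch -- the class name and
log format are mine, and rows matched/returned aren't visible at this
layer, so those would have to come from Solr itself:

import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
import javax.servlet.*;
import javax.servlet.http.*;

public class RequestLogFilter implements Filter {

    // The servlet API won't let you read the status back off the
    // response, so capture it on the way through.
    private static class StatusCapture extends HttpServletResponseWrapper {
        int status = HttpServletResponse.SC_OK;
        StatusCapture(HttpServletResponse resp) { super(resp); }
        public void setStatus(int sc) { status = sc; super.setStatus(sc); }
        public void sendError(int sc) throws IOException {
            status = sc;
            super.sendError(sc);
        }
        public void sendError(int sc, String msg) throws IOException {
            status = sc;
            super.sendError(sc, msg);
        }
    }

    public void init(FilterConfig config) {}
    public void destroy() {}

    public void doFilter(ServletRequest req, ServletResponse resp,
                         FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest hreq = (HttpServletRequest) req;
        StatusCapture hresp = new StatusCapture((HttpServletResponse) resp);
        long begin = System.currentTimeMillis();
        chain.doFilter(hreq, hresp);
        long elapsed = System.currentTimeMillis() - begin;

        // Same timestamp format as the HTTP access log, so the entries
        // can be correlated. A new format object per call, since
        // SimpleDateFormat isn't thread-safe.
        String ts = new SimpleDateFormat("[dd/MMM/yyyy:HH:mm:ss Z]")
                .format(new Date());
        String qs = hreq.getQueryString();
        System.out.println(ts + " " + hreq.getRemoteAddr() + " "
                + hreq.getRequestURI()
                + (qs == null ? "" : "?" + qs)  // includes start and rows
                + " " + hresp.status + " " + elapsed + "ms");
    }
}

Map it to your handler paths with a filter-mapping in web.xml and point
the output at a real logger instead of stdout.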

wunder

On 5/9/07 9:43 PM, "Ian Holsman" <[EMAIL PROTECTED]> wrote:
> 
> Walter Underwood wrote:
>> This is for monitoring -- what happened in the last 30 seconds.
>> Log file analysis doesn't really do that.
> 
> I would respectfully disagree.
> Log file analysis of each request can give you that, and a whole lot more.
> 
> You could either grab the stats via a regular cron job, or create a separate
> filter to parse them in real time.
> That would then let you pull more sophisticated stats if you chose to.
> 
> What I would like to know is (and excuse the newbieness of the question) how
> to get Solr to write a log file with the following data:
> 
> - time spent (ms) in the request.
> - IP# of the incoming request
> - what the request was (and what handler executed it)
> - a status code to signal if the request failed for some reason
> - number of rows fetched
> and 
> - the number of rows actually returned
> 
> Is this possible? (I'm using Tomcat, if that changes the answer.)
> 
> regards
> Ian
