We collect a lot of log events from our web servers. Because of compute 
limitations we can only keep 4 hours' worth of log data in 
Elasticsearch (roughly 100 million documents for those 4 hours). 

I'd like to run some high-level aggregation queries every 10 minutes and 
store the results in an aggregated index. I was going to hack together a 
Python script and throw a cron job out there to accomplish this, but it 
seems like rivers could be a good fit for this as well. Is there an 
Elasticsearch river for ...Elasticsearch? Is there a better way to run a 
query on an interval and store the results?
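
In case it helps, this is roughly the cron-driven script I had in mind. It's 
just a sketch, assuming the official elasticsearch-py client and made-up 
index/field names ("logs-*", "logs-agg", "status", "@timestamp"), so adjust 
to your own mappings:

from datetime import datetime, timedelta

from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])

# Aggregate over the 10-minute window that just ended.
now = datetime.utcnow().replace(microsecond=0)
window_start = now - timedelta(minutes=10)

# High-level rollup of the raw events, e.g. hit counts per status code.
query = {
    "size": 0,
    "query": {
        "range": {
            "@timestamp": {
                "gte": window_start.isoformat(),
                "lt": now.isoformat(),
            }
        }
    },
    "aggs": {
        "by_status": {"terms": {"field": "status"}}
    },
}

resp = es.search(index="logs-*", body=query)

# Flatten each aggregation bucket into a small summary document and
# store it in a separate, longer-lived aggregated index.
for bucket in resp["aggregations"]["by_status"]["buckets"]:
    doc = {
        "window_start": window_start.isoformat(),
        "window_end": now.isoformat(),
        "status": bucket["key"],
        "count": bucket["doc_count"],
    }
    es.index(index="logs-agg", doc_type="summary", body=doc)

and then something like this in crontab (the path is made up):

*/10 * * * * /usr/bin/python /path/to/rollup_logs.py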

Thanks for the help,

j. 
