Can ES scale to 30TB/day and still be usable? This is a typical logstash/elasticsearch/kibana setup. I have a small environment logging 20GB/day that seems to work fine. At 30TB/day, very little of the data will fit in RAM for caching, so can ES still be usable at that point?
Also, what is the best way to pick the proper index creation rate (daily vs. hourly indices)? Is there a guideline for maximum index size?
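For context, the index rollover interval in this kind of setup is normally driven by the index name pattern in the Logstash elasticsearch output. A minimal sketch below, daily vs. hourly; the host name is a placeholder and exact option names vary by Logstash version:

    output {
      elasticsearch {
        hosts => ["es-node1:9200"]            # placeholder host, not from this post
        # Daily indices (the usual Logstash default pattern):
        index => "logstash-%{+YYYY.MM.dd}"
        # For hourly indices instead, include the hour in the pattern:
        # index => "logstash-%{+YYYY.MM.dd.HH}"
      }
    }

The question is essentially which of these two patterns (or something in between) keeps individual index/shard sizes manageable at 30TB/day.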
