I understand this may depend on a lot of factors, but I am curious about what an efficient number of indexes is for a large data set.
I would like to break up the indexes by user and by date (I think), mostly because it will make data management easier on my end. I am wondering at what point Elasticsearch will start to have issues with the number of indexes: is 10 a reasonable number? 100? 1,000? 10,000?

I would like to break up the indexes as much as possible and use aliases to search the data of interest, but I don't want to create so many indexes that it has an adverse effect on performance (a rough sketch of the layout I have in mind is below). I would appreciate any insight into what is recommended and what others have experienced.

Thanks in advance.

-Kevin
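For concreteness, here is a rough sketch of the per-user, per-month layout I am considering, written against the elasticsearch-py client. The index and alias names ("events-<user>-<YYYY.MM>", "events-<user>") are just placeholders, and the exact keyword arguments may differ between client versions (older clients take body= where newer ones take document=/query=):

    from datetime import date
    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")

    def index_name(user, day):
        # One index per user per month, e.g. "events-kevin-2014.01".
        return "events-%s-%s" % (user, day.strftime("%Y.%m"))

    def ensure_index(user, day):
        name = index_name(user, day)
        if not es.indices.exists(index=name):
            es.indices.create(index=name)
            # Point a per-user alias at every monthly index for that user,
            # so searches can just target "events-kevin" instead of
            # listing the individual monthly indexes.
            es.indices.put_alias(index=name, name="events-%s" % user)
        return name

    # Write a document into the right monthly index...
    target = ensure_index("kevin", date(2014, 1, 15))
    es.index(index=target, document={"msg": "hello", "ts": "2014-01-15"})

    # ...and search across all of that user's indexes through the alias.
    result = es.search(index="events-kevin", query={"match": {"msg": "hello"}})
    print(result["hits"]["total"])

The idea is that each write lands in a narrow monthly index, while reads go through a per-user alias, so old months can be dropped or archived by simply deleting their indexes.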
