I understand both opinions. For some context on my project: it involves 
migrating data from Oracle to Elasticsearch so I can analyze it with Kibana. 
The data is mostly logs, millions of them (around 90,000,000 lines), and I'm 
looking for the best way to do that. I also don't know how large my cluster 
needs to be, because new data is inserted every day. I inserted data the way 
David explained, fixed the error I had, and everything worked well; I can now 
see my data in Kibana. I have another question: how can I insert not just 
1,000 records at a time, as I'm doing now, but all of my data, which is 
around 90,000,000 rows and growing fast?
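For loading that many rows, one common pattern is to keep using the bulk API you already have working, but stream rows from Oracle with a cursor and feed them to it in fixed-size batches instead of stopping after the first 1,000. Below is a minimal sketch of the batching part in Python; the `batches` helper and the batch size are illustrative, and the actual indexing call would be whatever bulk request you are already sending (e.g. the Elasticsearch `_bulk` endpoint via a client library).

```python
from itertools import islice

def batches(rows, size=1000):
    """Yield successive lists of at most `size` rows from any iterator.

    `rows` can be a database cursor, so the full 90M rows never need
    to be held in memory at once.
    """
    it = iter(rows)
    while True:
        batch = list(islice(it, size))
        if not batch:
            break
        yield batch

# Demo with 9 fake "log rows" and a batch size of 4:
# the loop keeps going until the source is exhausted,
# so every row gets sent, not just the first batch.
sizes = [len(b) for b in batches(range(9), size=4)]
print(sizes)  # [4, 4, 1]
```

In the real pipeline, each `batch` would be turned into one bulk request to Elasticsearch, and the same loop run daily picks up the newly inserted Oracle rows as well.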

Thank you both.

-- 
You received this message because you are subscribed to the Google Groups 
"elasticsearch" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/elasticsearch/30010282-64e5-4b94-9f9a-f16aa0a7659e%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
