Gopi, you really have a CSV file, just one that uses ^ instead of , as the delimiter.
I happened to write my own CSV-to-JSON converter, giving it the options I needed (including specification or auto-detection of numbers, date-format normalization, auto-creation of the action-and-metadata line, and so on). I did this before stumbling across logstash, but I still found it easier to write and maintain this code myself. Choose whatever language you wish: I wrote one version of mine in C++ and the subsequent version in Java. I also wrote a bulk-load client in Java to avoid the limitations of curl (and its complete absence on various platforms). (logstash is much better for log files; my converter is much better for generic CSV.)

I know this isn't exactly the pre-written tool you are looking for. But converting the CSV (with an option to override the delimiter) into JSON isn't very hard to do, and once that's done, it's an easy matter to add the action-and-metadata line and have a bulk-ready data stream. A rough sketch of such a converter follows the quoted message below.

Brian

On Wednesday, January 7, 2015 6:40:34 AM UTC-5, Gopimanikandan Sengodan wrote:
>
> Hi All,
>
> We are planning to load data into Elasticsearch from a delimited file.
>
> The file has been delimited with the 0x88 (ˆ) delimiter.
>
> Can you please let me know how to load the delimited file into Elastic?
>
> Also, please let me know the best and fastest way to load
> millions of records into Elasticsearch?
>
> SAMPLE:
>
> XXXXXˆYYYYYYˆZZZZ
>
> Thanks,
> Gopi
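For what it's worth, here is a minimal Java sketch of that kind of converter. It is not my actual code, just an illustration: it assumes the first line of the file holds the field names, that no field contains the delimiter or needs quoting, and that every value is indexed as a string (no number detection or date normalization). The index and type names ("myindex", "mytype") are placeholders you would replace with your own.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.PrintWriter;

// Minimal caret-delimited-CSV to Elasticsearch bulk-JSON converter (sketch).
// Assumes: first line holds field names, no quoting/escaping inside fields,
// and every value is written as a JSON string.
public class CaretCsvToBulk {

    private static final String DELIMITER = "\\^";   // regex for the ^ delimiter

    public static void main(String[] args) throws Exception {
        String inFile  = args[0];                                 // e.g. data.csv
        String outFile = args[1];                                 // e.g. bulk.json
        String index   = args.length > 2 ? args[2] : "myindex";   // placeholder index name
        String type    = args.length > 3 ? args[3] : "mytype";    // placeholder type name

        try (BufferedReader in = new BufferedReader(new FileReader(inFile));
             PrintWriter out = new PrintWriter(outFile)) {

            String headerLine = in.readLine();
            if (headerLine == null) return;                       // empty input, nothing to do
            String[] fields = headerLine.split(DELIMITER, -1);

            String line;
            while ((line = in.readLine()) != null) {
                if (line.isEmpty()) continue;
                String[] values = line.split(DELIMITER, -1);

                // Action-and-metadata line required by the _bulk API
                out.println("{\"index\":{\"_index\":\"" + index + "\",\"_type\":\"" + type + "\"}}");

                // Source document built from the header names and this row's values
                StringBuilder doc = new StringBuilder("{");
                for (int i = 0; i < fields.length && i < values.length; i++) {
                    if (i > 0) doc.append(',');
                    doc.append('"').append(escape(fields[i])).append("\":\"")
                       .append(escape(values[i])).append('"');
                }
                doc.append('}');
                out.println(doc);
            }
        }
    }

    // Escape the characters that would break a JSON string literal
    private static String escape(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"");
    }
}

The resulting file is bulk-ready: every line already ends with a newline, as the _bulk endpoint requires, so you can post it with curl or, better, feed it to a bulk-load client.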
