In addition to Chad's comments, as of NiFi 1.3.0 there is also a PutElasticsearchHttpRecord processor [1], which allows you to accept records in any format (you just have to set up the appropriate reader, such as an AvroReader in this case) and otherwise works like the PutElasticsearchHttp processor. I see you are using PutElasticsearch rather than PutElasticsearchHttp; if you have a large number of records in a flow file (such as a single flow file coming from QueryDatabaseTable) and you are using PutElasticsearch instead of PutElasticsearchHttp for performance reasons, you may find that PutElasticsearchHttpRecord outperforms even the native PutElasticsearch processor.
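To make the performance point concrete: all of these processors ultimately batch documents into an Elasticsearch Bulk API request, and a record-aware processor can serialize many records into a single bulk body. Here is a minimal sketch (plain Python, with hypothetical field names and index/type values; this is an illustration of the bulk wire format, not NiFi's actual internal code):

```python
import json

def build_bulk_body(index, doc_type, records):
    """Serialize a batch of records into an Elasticsearch Bulk API body.

    Each record becomes two NDJSON lines: an action line telling ES where
    to index the document, then the document source itself. The Bulk API
    requires the body to end with a trailing newline. (The _type field
    applies to ES 5.x, the version discussed in this thread.)
    """
    lines = []
    for rec in records:
        lines.append(json.dumps({"index": {"_index": index, "_type": doc_type}}))
        lines.append(json.dumps(rec))
    return "\n".join(lines) + "\n"

# Hypothetical records, e.g. rows fetched by QueryDatabaseTable
records = [
    {"id": 1, "name": "alice"},
    {"id": 2, "name": "bob"},
]
body = build_bulk_body("my_index", "my_type", records)
print(body)
```

Sending all the records in one such request (instead of one index call per document) is where the batching win comes from.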
Regards,
Matt

[1] https://issues.apache.org/jira/browse/NIFI-4002

On Mon, Jun 19, 2017 at 2:44 PM, Chad Zobrisky <[email protected]> wrote:
> Hello,
>
> The PutElasticSearch processor does expect the flow file to be in JSON
> format. PutElasticSearch uses the Bulk Java API (1), and other examples
> can be seen here (2). The processor will create the correct structure as
> seen in the bulk insert docs, but expects the incoming file to be the
> field-value JSON mapping. An actual example can be seen here (3), showing
> the format of the flow file that was inserted.
>
> Thanks,
> Chad Zobrisky
> www.nifi.rocks
>
> (1) https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-bulk.html
> (2) https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-index_.html
> (3) http://www.nifi.rocks/executescript-groovy-example/
>
> On Thu, Jun 15, 2017 at 8:48 AM zmichael <[email protected]> wrote:
>
>> Hi, I'm trying to use Apache NiFi 1.3.0 to fetch data from a Postgres
>> database (and from the file system as well) and put it into
>> Elasticsearch 5.2.2 (I've also tried Elasticsearch 5.0.0 and 5.0.1,
>> but the result is the same).
>> So this is my NiFi dataflow for the database:
>>
>> <http://apache-nifi-developer-list.39713.n7.nabble.com/file/n16198/nifi_dataflow.png>
>>
>> And this is the QueryDatabaseTable config:
>>
>> <http://apache-nifi-developer-list.39713.n7.nabble.com/file/n16198/nifi_config_1.png>
>>
>> And the PutElasticSearch5 config:
>>
>> <http://apache-nifi-developer-list.39713.n7.nabble.com/file/n16198/nifi_config_2.png>
>>
>> And the DBCPConnectionPool config:
>>
>> <http://apache-nifi-developer-list.39713.n7.nabble.com/file/n16198/nifi_config_3.png>
>>
>> So when I run each of the processors, everything works as expected on
>> QueryDatabaseTable (the same with the GetFile processor), but on my
>> Elasticsearch instance I get this exception:
>>
>> org.elasticsearch.index.mapper.MapperParsingException: failed to parse
>> Caused by: org.elasticsearch.common.compress.NotXContentException:
>> Compressor detection can only be called on some xcontent bytes or
>> compressed xcontent bytes
>>     at org.elasticsearch.common.compress.CompressorFactory.compressor(CompressorFactory.java:57) ~[elasticsearch-5.2.2.jar:5.2.2]
>>     at org.elasticsearch.common.xcontent.XContentHelper.createParser(XContentHelper.java:48) ~[elasticsearch-5.2.2.jar:5.2.2]
>>     at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:62) ~[elasticsearch-5.2.2.jar:5.2.2]
>>
>> as shown in this screenshot:
>>
>> <http://apache-nifi-developer-list.39713.n7.nabble.com/file/n16198/exception.png>
>>
>> I get the same exception with Elasticsearch 5.0.0 and 5.0.1, and even
>> when I try the PutElasticSearch processor with Elasticsearch 2.4.0.
>>
>> So am I missing something in my config? Or do I need to convert the
>> flow files to a particular format (JSON, for example) to get this
>> working with Elasticsearch?
>> Please help me.
>> :-)
>>
>> --
>> View this message in context: http://apache-nifi-developer-list.39713.n7.nabble.com/PutElasticSearch5-error-tp16198.html
>> Sent from the Apache NiFi Developer List mailing list archive at Nabble.com.
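A note on the NotXContentException in the question above: QueryDatabaseTable emits flow files in Avro format, while PutElasticSearch expects JSON documents. Elasticsearch sniffs the leading bytes of each document to detect its content type, and an Avro data file begins with the magic bytes `Obj\x01`, which match no XContent format, hence the "Compressor detection can only be called on some xcontent bytes" failure. A rough sketch of that detection (illustrative only; the real check lives in Elasticsearch's XContent classes):

```python
# Avro Object Container Files start with the magic bytes "Obj" + 0x01;
# JSON documents start with '{' (or '[' for arrays). Elasticsearch's
# content-type sniffing accepts the latter but not the former.

AVRO_MAGIC = b"Obj\x01"  # Avro Object Container File header

def looks_like_json(content: bytes) -> bool:
    """Very rough content sniffing, in the spirit of ES XContent detection."""
    stripped = content.lstrip()
    return stripped.startswith(b"{") or stripped.startswith(b"[")

avro_flowfile = AVRO_MAGIC + b"...binary avro blocks..."  # hypothetical content
json_flowfile = b'{"id": 1, "name": "alice"}'             # hypothetical content

print(looks_like_json(avro_flowfile))  # the Avro case fails detection
print(looks_like_json(json_flowfile))  # a JSON document passes
```

So the practical fixes are the ones the replies describe: convert the Avro output to JSON before PutElasticSearch (for example with a ConvertAvroToJSON processor), or use PutElasticsearchHttpRecord with an AvroReader so the conversion happens inside the processor.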
