Can you provide a sample JSON output from your ConvertAvroToJSON processor?
That would help pinpoint the source of the mapping/parsing exceptions.
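In the meantime, one common cause of the `not_x_content_exception` below is that ConvertAvroToJSON can emit a single JSON *array* of records per flow file, while PutElasticsearch expects each flow file to contain one JSON object. If that is what is happening here, inserting a SplitJson processor (with a JsonPath such as `$[*]`) between ConvertAvroToJSON and PutElasticsearch usually resolves it. A rough sketch of the split SplitJson performs, with made-up record fields:

```python
import json

def split_records(flowfile_content: str) -> list[str]:
    """Split a JSON array payload (as ConvertAvroToJSON may emit)
    into one JSON object per record -- the shape PutElasticsearch
    can index. A lone object passes through unchanged."""
    data = json.loads(flowfile_content)
    if isinstance(data, list):
        return [json.dumps(rec) for rec in data]
    return [json.dumps(data)]

# Hypothetical two-record payload: becomes two indexable documents.
docs = split_records('[{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]')
```

This is only a sketch of the behavior, not NiFi's implementation; whether it applies depends on what your ConvertAvroToJSON output actually looks like.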

Thanks,
Matt

On Thu, Apr 7, 2016 at 1:31 PM, Madhukar Thota <[email protected]>
wrote:

> I am able to construct the dataflow with the following processors:
>
> ExecuteSQL --> ConvertAvroToJSON --> Elasticsearch
>
> The problem I am seeing is that Elasticsearch is unable to index the data
> because of mapping parser exceptions.
>
> 13:27:37 EDT
> ERROR
> fc43fc28-215c-469a-9908-73d04d98d4c2
>
> PutElasticsearch[id=fc43fc28-215c-469a-9908-73d04d98d4c2] Failed to insert 
> StandardFlowFileRecord[uuid=02af852b-bdf7-452f-a320-b23753c13389,claim=StandardContentClaim
>  [resourceClaim=StandardResourceClaim[id=1460050039787-4636, 
> container=default, section=540], offset=0, 
> length=697677],offset=0,name=1386348391725491,size=697677] into Elasticsearch 
> due to MapperParsingException[failed to parse]; nested: 
> NotSerializableExceptionWrapper[not_x_content_exception: Compressor detection 
> can only be called on some xcontent bytes or compressed xcontent bytes];, 
> transferring to failure
>
>
>
>
> Am I doing anything wrong here, or do I need an extra processor to convert
> the data into a format Elasticsearch understands?
>
>
>
> On Thu, Apr 7, 2016 at 7:49 AM, Madhukar Thota <[email protected]>
> wrote:
>
>> Friends,
>>
>> I am exploring the ExecuteSQL processor in NiFi, and my goal is to get
>> SQL data ingested into Elasticsearch.
>>
>> Can someone share or suggest what the flow should look like?
>>
>>
>> Thanks in advance.
>>
>
>
