You can simply route the same relationship from GetKafka to MergeContent as a second connection, so that MergeContent also receives the original message and can merge it.
See attached image for example:

Oleg

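In case the attached image does not come through in the archive, the fan-out being described looks roughly like this (the same success relationship from GetKafka drawn to two destinations; processor names taken from the steps in the original question below):

```
GetKafka ── success ──┬──> ConvertFromXXXToJson ──> EvaluateJsonPath ──┐
                      │                                                ▼
                      └─────────────────────────────────────────> MergeContent ──> PutHDFS
```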




On Oct 13, 2015, at 5:22 AM, yejug <[email protected]> wrote:

Hi

I'm new to NiFi and trying to implement a simple workflow:

1) *GetKafka* > original encoded message (Avro/MsgPack/some other format...)
2) *ConvertFromXXXToJson* > JSON message
3) *EvaluateJsonPath* > read some fields from the decoded JSON and put them
into FlowFile attributes (in my case I simply read the timestamp field)
4) *MergeContent* > merge the original messages, using the timestamp
attribute as the "correlation" attribute
5) *PutHDFS* > save the batch of original messages into a directory on
HDFS
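Outside NiFi, the extract-and-correlate logic of steps 3) and 4) amounts to something like the following Python sketch (the messages and field names are made up for illustration; in NiFi the original bytes would be the FlowFile content and the timestamp would be stored as a FlowFile attribute by EvaluateJsonPath, e.g. via the path `$.timestamp`):

```python
import json
from collections import defaultdict

# Hypothetical pairs of (original encoded bytes, decoded JSON string).
messages = [
    (b"orig-1", '{"timestamp": "2015-10-13", "value": 1}'),
    (b"orig-2", '{"timestamp": "2015-10-13", "value": 2}'),
    (b"orig-3", '{"timestamp": "2015-10-14", "value": 3}'),
]

# Step 3) EvaluateJsonPath: read the timestamp field from each decoded JSON.
# Step 4) MergeContent: bin the ORIGINAL messages by that correlation value.
bins = defaultdict(list)
for original, decoded_json in messages:
    timestamp = json.loads(decoded_json)["timestamp"]  # roughly $.timestamp
    bins[timestamp].append(original)

# Step 5) would then write each merged bin as one batch (here, just print).
for ts, originals in sorted(bins.items()):
    print(ts, b"\n".join(originals))
```

The point of the sketch is that the correlation key is read from the *decoded* form while the *original* bytes are what get merged, which is exactly why the original content must still be reachable at the merge step.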

The problem is: after step 2) I lose the original messages read from
Kafka, and only the parsed JSONs are available.
Is it possible, using the standard set of NiFi processors/services, to access
the original messages at step 4) without developing a custom ConvertXXXToJson
processor (in the simplest case a copy-paste) that transmits the input
FlowFiles to a new custom relationship?



--
View this message in context: 
http://apache-nifi-developer-list.39713.n7.nabble.com/Ingest-Original-data-from-External-system-by-data-s-dependent-condition-tp3093.html
Sent from the Apache NiFi Developer List mailing list archive at 
Nabble.com.

