Hi

I'm new to NiFi and trying to implement a simple workflow:

1) GetKafka > original encoded message (Avro/MessagePack/some other
format...)
2) ConvertFromXXXToJson > JSON message
3) EvaluateJsonPath > read some fields from the decoded JSON and put them
into FlowFile attributes (in my case I simply read the "timestamp" field)
4) MergeContent > merge the original messages, using the "timestamp"
attribute as the "correlation" attribute
5) PutHDFS > save the batch of original messages into some directory on
HDFS
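For clarity, the grouping that steps 3) and 4) are meant to achieve can be sketched in a few lines of Python (the sample messages, the "timestamp" field name, and the values are made up for illustration; this only mirrors the intended logic, not NiFi itself):

```python
import json
from collections import defaultdict

# Hypothetical decoded messages, as they would look after step 2)
messages = [
    '{"timestamp": "2016-01-01T00:00:00Z", "value": 1}',
    '{"timestamp": "2016-01-01T00:00:00Z", "value": 2}',
    '{"timestamp": "2016-01-01T00:01:00Z", "value": 3}',
]

batches = defaultdict(list)
for raw in messages:
    # Step 3): EvaluateJsonPath with JsonPath $.timestamp -> FlowFile attribute
    ts = json.loads(raw)["timestamp"]
    # Step 4): MergeContent groups FlowFiles sharing the correlation attribute
    batches[ts].append(raw)

# Each batch would then be written to HDFS in step 5)
print({ts: len(batch) for ts, batch in batches.items()})
```

The catch described below is that in the real flow it is the *original* (still encoded) messages, not the JSON copies, that should end up in `batches`.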

The problem is: after step 2) I lose the original messages read from
Kafka, and only the parsed JSONs are available.
Is it possible, using only the standard set of NiFi processors/services,
to access the original messages at step 4), without developing custom
ConvertXXXtoJson processors (in the simplest case by copy-pasting the
existing ones) that transmit the input FlowFiles to a new custom
relationship?



--
View this message in context: 
http://apache-nifi-developer-list.39713.n7.nabble.com/Ingest-Original-data-from-External-system-by-data-s-dependent-condition-tp3093.html
Sent from the Apache NiFi Developer List mailing list archive at Nabble.com.
