[ https://issues.apache.org/jira/browse/NIFI-2298?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15388667#comment-15388667 ]

ASF GitHub Bot commented on NIFI-2298:
--------------------------------------

Github user xmlking commented on the issue:

    https://github.com/apache/nifi/pull/687
  
    Yes. Thanks 
    
    
    > On Jul 21, 2016, at 5:09 PM, Oleg Zhurakousky <[email protected]> wrote:
    > 
    > @xmlking Back pressure is a function of all processors, so there is nothing specific required in ConsumeKafka to handle it. Does that answer your question?



> Add missing futures for ConsumeKafka
> ------------------------------------
>
>                 Key: NIFI-2298
>                 URL: https://issues.apache.org/jira/browse/NIFI-2298
>             Project: Apache NiFi
>          Issue Type: Improvement
>          Components: Extensions
>    Affects Versions: 0.7.0
>            Reporter: sumanth chinthagunta
>            Assignee: Oleg Zhurakousky
>              Labels: kafka
>             Fix For: 1.0.0, 0.8.0
>
>
> The new ConsumeKafka processor is missing some capabilities that were 
> present in the old GetKafka processor. 
> 1. The new ConsumeKafka does not write critical Kafka attributes, i.e., 
> kafka.key, kafka.offset, kafka.partition, etc., into FlowFile attributes. 
> Old GetKafka processor: 
> {quote}
> Standard FlowFile Attributes
> Key: 'entryDate'
>                Value: 'Sun Jul 17 15:17:00 CDT 2016'
> Key: 'lineageStartDate'
>                Value: 'Sun Jul 17 15:17:00 CDT 2016'
> Key: 'fileSize'
>                Value: '183'
> FlowFile Attribute Map Content
> Key: 'filename'
>                Value: '19709945781167274'
> Key: 'kafka.key'
>                Value: '\{"database":"test","table":"sc_job","pk.systemid":1\}'
> Key: 'kafka.offset'
>                Value: '1184010261'
> Key: 'kafka.partition'
>                Value: '0'
> Key: 'kafka.topic'
>                Value: ‘data'
> Key: 'path'
>                Value: './'
> Key: 'uuid'
>                Value: '244059bb-9ad9-4d74-b1fb-312eee72124a'
>  {quote}
>  
> New ConsumeKafka processor: 
>  {quote}
> Standard FlowFile Attributes
> Key: 'entryDate'
>                Value: 'Sun Jul 17 15:18:41 CDT 2016'
> Key: 'lineageStartDate'
>                Value: 'Sun Jul 17 15:18:41 CDT 2016'
> Key: 'fileSize'
>                Value: '183'
> FlowFile Attribute Map Content
> Key: 'filename'
>                Value: '19710046870478139'
> Key: 'path'
>                Value: './'
> Key: 'uuid'
>                Value: '349fbeb3-e342-4533-be4c-424793fa5c59'
> {quote}
> 2. GetKafka/PutKafka are compatible with Kafka 0.8.x and 0.9.x. 
> Please base the new PublishKafka/ConsumeKafka processors on the Kafka 0.10 
> client. 
> 3. Support subscribing to multiple topics, i.e., topic: topic1,topic2 
> 4. Support configurable serializers/deserializers for String, JSON, Avro, etc. 
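A minimal sketch of point 1 above: copying per-record Kafka metadata into the flowfile attribute map that the old GetKafka processor emitted. This is not NiFi's actual implementation; `KafkaRecordMeta` is a hypothetical stand-in for the Kafka 0.10 `ConsumerRecord`, and only the `kafka.*` attribute names come from the issue report.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for the fields of Kafka 0.10's ConsumerRecord
// that the issue asks ConsumeKafka to surface.
class KafkaRecordMeta {
    final String topic;
    final int partition;
    final long offset;
    final String key;

    KafkaRecordMeta(String topic, int partition, long offset, String key) {
        this.topic = topic;
        this.partition = partition;
        this.offset = offset;
        this.key = key;
    }
}

public class KafkaAttributeMapper {
    // Build the attribute map the issue asks ConsumeKafka to write,
    // using the same keys GetKafka produced (kafka.topic, kafka.partition,
    // kafka.offset, kafka.key).
    static Map<String, String> toFlowFileAttributes(KafkaRecordMeta rec) {
        Map<String, String> attrs = new HashMap<>();
        attrs.put("kafka.topic", rec.topic);
        attrs.put("kafka.partition", Integer.toString(rec.partition));
        attrs.put("kafka.offset", Long.toString(rec.offset));
        if (rec.key != null) {
            // Keys are optional in Kafka; only write the attribute when present.
            attrs.put("kafka.key", rec.key);
        }
        return attrs;
    }

    public static void main(String[] args) {
        KafkaRecordMeta rec =
            new KafkaRecordMeta("data", 0, 1184010261L, "{\"pk\":1}");
        System.out.println(toFlowFileAttributes(rec));
    }
}
```

For point 3, the real Kafka 0.10 consumer already accepts a topic list via `consumer.subscribe(Arrays.asList("topic1", "topic2"))`, so the processor would only need to split a comma-separated property value before subscribing.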



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
