gaborgsomogyi commented on pull request #28623:
URL: https://github.com/apache/spark/pull/28623#issuecomment-634619571


   I've taken a look at it and here are my thoughts:
   * > According to KafkaConsumer specification, the consumer group in the 
"assign" strategy is not used.
   
   This is simply not true. One can commit offsets back with a consumer, and in
that case the group id is used.
   Structured Streaming doesn't do this at the moment, but I don't want to
close off this possibility.
   * We've already added a solution in Spark 3.0.0 that addresses exactly this issue;
I don't see a reason to solve it a different way.
   * There is a configuration solution to this on broker side: `bin/kafka-acls 
--authorizer kafka.security.auth.SimpleAclAuthorizer --authorizer-properties 
zookeeper.connect=zk:2181 --add --allow-principal User:'Bon' --operation READ 
--topic topicName --group='spark-kafka-source-' --resource-pattern-type 
prefixed`
   * This is only my personal view, but I don't see the gain in adding this; on the
other hand, additional code means more maintenance.
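   To illustrate the first point: even when partitions are assigned manually
(the "assign" strategy), the configured `group.id` is still what offset commits
are recorded under on the broker. The sketch below only builds the consumer
configuration with stdlib classes; the broker address, group id, and topic name
are made-up placeholders, and the actual `KafkaConsumer` calls (which need
kafka-clients on the classpath and a running broker) are shown as comments.

   ```java
   import java.util.Properties;

   public class AssignWithGroupId {
       public static void main(String[] args) {
           // Hypothetical configuration: even with assign(), offset commits
           // are stored on the broker under this group.id.
           Properties props = new Properties();
           props.put("bootstrap.servers", "broker:9092");          // assumed address
           props.put("group.id", "spark-kafka-source-example");    // used by commitSync()
           props.put("enable.auto.commit", "false");

           // With a real KafkaConsumer (needs kafka-clients, not shown here):
           //   consumer.assign(Collections.singletonList(new TopicPartition("topicName", 0)));
           //   consumer.commitSync(offsetsToCommit);  // commit recorded under group.id
           System.out.println(props.getProperty("group.id"));
       }
   }
   ```

   This is why the group id cannot be treated as unused under the assign
strategy: dropping it would rule out committing offsets back later.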
   
   Overall I wouldn't merge it, but if somebody thinks it's worth it, I have a
couple of further comments.
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


