Hi,

Is there any example code that implements a Kafka output for Spark Streaming? My use
case is that all the output needs to be written back to a Kafka cluster after
the data is processed. What would be the guidelines for implementing such a
function? I heard that foreachRDD creates one producer instance per batch; if so,
will that hurt performance?

Thanks,

Weide
