Github user marmbrus commented on the issue:

    https://github.com/apache/spark/pull/15483
  
    Thanks for working on this!  However, I'm not sure that this is something 
that we should merge into the core repository (though I think it's an awesome 
example of how to use the `ForeachWriter` interface!  You should consider 
writing a blog post or even publishing a Spark package).  Specifically, when we 
add support for writing to Kafka, I think we'll want to do it through both the 
DataSource API and the `Sink` API.  That would let users seamlessly write batch 
and streaming jobs to Kafka in Java, Scala, Python, R, and SQL using the same 
code.
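
    For context, the `ForeachWriter` approach mentioned above can be sketched 
roughly as follows.  This is a minimal illustration, not the code in this PR: 
the broker address, topic name, and `String`-typed rows are assumptions for the 
example.

    ```scala
    import java.util.Properties

    import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
    import org.apache.spark.sql.ForeachWriter

    // Sketch: send each row of a streaming query to a Kafka topic.
    // Assumed: rows are Strings; brokers/topic are supplied by the caller.
    class KafkaForeachWriter(brokers: String, topic: String)
        extends ForeachWriter[String] {

      // Created in open() on the executor, so it is never serialized
      // with the writer itself.
      @transient private var producer: KafkaProducer[String, String] = _

      override def open(partitionId: Long, version: Long): Boolean = {
        val props = new Properties()
        props.put("bootstrap.servers", brokers)
        props.put("key.serializer",
          "org.apache.kafka.common.serialization.StringSerializer")
        props.put("value.serializer",
          "org.apache.kafka.common.serialization.StringSerializer")
        producer = new KafkaProducer[String, String](props)
        true  // true = process the rows of this partition/version
      }

      override def process(value: String): Unit = {
        producer.send(new ProducerRecord[String, String](topic, value))
      }

      override def close(errorOrNull: Throwable): Unit = {
        if (producer != null) producer.close()
      }
    }
    ```

    A caller would wire it in with something like 
`df.writeStream.foreach(new KafkaForeachWriter("localhost:9092", "events")).start()`.  
Note that this per-row approach gives none of the batch/SQL integration that a 
proper DataSource/`Sink` implementation would.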


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---
