Is there a plan to have this in PySpark in some later release?

On Wed, 25 Jan 2017 at 10:01 am, Koert Kuipers <ko...@tresata.com> wrote:

> I implemented a sink using foreach; it was indeed straightforward. Thanks!
>
> On Fri, Jan 13, 2017 at 6:30 PM, Tathagata Das <
> tathagata.das1...@gmail.com> wrote:
>
> Structured Streaming has a foreach sink, where you can essentially do what
> you want with your data. It's easy to create a Kafka producer and write the
> data out to Kafka.
>
> http://spark.apache.org/docs/latest/structured-streaming-programming-guide.html#using-foreach
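>
> To make this concrete, here is a minimal sketch of a ForeachWriter that
> publishes each record to Kafka. The topic name, broker address, and
> string-valued records are assumptions for illustration, not a definitive
> implementation:
>
> import java.util.Properties
> import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
> import org.apache.spark.sql.ForeachWriter
>
> // One producer is opened per partition/epoch and closed afterwards.
> class KafkaSink(topic: String, servers: String) extends ForeachWriter[String] {
>   var producer: KafkaProducer[String, String] = _
>
>   override def open(partitionId: Long, version: Long): Boolean = {
>     val props = new Properties()
>     props.put("bootstrap.servers", servers) // assumed broker address
>     props.put("key.serializer",
>       "org.apache.kafka.common.serialization.StringSerializer")
>     props.put("value.serializer",
>       "org.apache.kafka.common.serialization.StringSerializer")
>     producer = new KafkaProducer[String, String](props)
>     true // proceed with this partition
>   }
>
>   override def process(value: String): Unit =
>     producer.send(new ProducerRecord[String, String](topic, value))
>
>   override def close(errorOrNull: Throwable): Unit =
>     if (producer != null) producer.close()
> }
>
> // Usage (topic and broker are placeholders):
> // streamingDF.as[String].writeStream
> //   .foreach(new KafkaSink("my-topic", "localhost:9092"))
> //   .start()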
>
> On Fri, Jan 13, 2017 at 8:28 AM, Koert Kuipers <ko...@tresata.com> wrote:
>
> How do you do this with Structured Streaming? I see no mention of writing
> to Kafka.
>
> On Fri, Jan 13, 2017 at 10:30 AM, Peyman Mohajerian <mohaj...@gmail.com>
> wrote:
>
> Yes, it is called Structured Streaming:
> https://docs.databricks.com/_static/notebooks/structured-streaming-kafka.html
>
> http://spark.apache.org/docs/latest/structured-streaming-programming-guide.html
>
> On Fri, Jan 13, 2017 at 3:32 AM, Senthil Kumar <senthilec...@gmail.com>
> wrote:
>
> Hi Team,
>
>      Sorry if this question has already been asked in this forum.
>
> Can we ingest data into an Apache Kafka topic from a Spark SQL DataFrame?
>
> Here is my code, which reads a Parquet file:
>
> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
>
> val df = sqlContext.read.parquet("..../temp/*.parquet")
>
> df.registerTempTable("beacons")
>
>
> I want to ingest the df DataFrame directly into Kafka! Is there any way to
> achieve this?
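>
> For illustration, this is the kind of thing I mean, just a rough, untested
> sketch using foreachPartition with a plain Kafka producer (the broker
> address and topic name are placeholders):
>
> import java.util.Properties
> import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
>
> // Serialize each row as JSON and send it, one producer per partition.
> df.toJSON.foreachPartition { rows: Iterator[String] =>
>   val props = new Properties()
>   props.put("bootstrap.servers", "localhost:9092") // placeholder broker
>   props.put("key.serializer",
>     "org.apache.kafka.common.serialization.StringSerializer")
>   props.put("value.serializer",
>     "org.apache.kafka.common.serialization.StringSerializer")
>   val producer = new KafkaProducer[String, String](props)
>   try rows.foreach(r =>
>     producer.send(new ProducerRecord[String, String]("beacons", r)))
>   finally producer.close()
> }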
>
>
> Cheers,
>
> Senthil
>
--
Best Regards,
Ayan Guha
