To: …, "ymaha...@snappydata.io", "priy...@asperasoft.com", "user @spark"
Subject: Re: [Structured Streaming] Avoiding multiple streaming queries

@Silvio: I thought about duplicating rows but dropped the idea because of the increased memory use. forEachBatch sounds interesting!
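The foreachBatch idea mentioned above can be sketched without Spark at all: a single streaming query hands each micro-batch to one callback, which filters the batch by a marker field and passes each slice to its own sink. The record shape, marker values, and sink names below are illustrative, not from the thread; in real Spark code the callback would receive a batch DataFrame and call its writers.

```python
# Spark-free sketch of the dispatch pattern foreachBatch enables:
# one query, one callback per micro-batch, N sinks fed from it.

def route_batch(batch, sinks):
    """Split one micro-batch by its 'marker' field and hand each
    slice to the matching sink's write function."""
    for marker, write in sinks.items():
        slice_ = [r for r in batch if r["marker"] == marker]
        if slice_:
            write(slice_)

# Two toy sinks that simply collect what they are given.
kafka_out, mysql_out = [], []
sinks = {
    "kafka": kafka_out.extend,
    "mysql": mysql_out.extend,
}

batch = [
    {"marker": "kafka", "value": 1},
    {"marker": "mysql", "value": 2},
    {"marker": "kafka", "value": 3},
]
route_batch(batch, sinks)
# kafka_out now holds the two 'kafka' records, mysql_out the 'mysql' one.
```

One design note: because foreachBatch sees the whole micro-batch, each sink can be written with its own batch-level semantics, which the per-record foreach writer cannot offer.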
To: …, "priy...@asperasoft.com", "user @spark"
Subject: Re: [Structured Streaming] Avoiding multiple streaming queries

I understand each row has a topic column, but can we write one row to multiple topics?

On Thu, Jul 12, 2018 at 11:00 AM, Arun Mahadevan <ar...@apache.org> wrote:
>> … sinks like Kafka, and you need to write the custom logic yourself; you cannot scale the partitions for the sinks independently.
>>
>> [1] https://spark.apache.org/docs/2.1.2/api/java/org/apache/spark/sql/ForeachWriter.html
>>
>> From: chandan prakash
To: …n Iyer
Cc: Tathagata Das, "ymaha...@snappydata.io", "priy...@asperasoft.com", "user @spark"
Subject: Re: [Structured Streaming] Avoiding multiple streaming queries

Thanks a lot Arun for your response.
I got your point that existing sink plugins like Kafka, etc. …
Date: Thursday, July 12, 2018, 2:38 AM
To: Tathagata Das, "ymaha...@snappydata.io", "priy...@asperasoft.com", "user @spark"
Subject: Re: [Structured Streaming] Avoiding multiple streaming queries

Hi,
Did any of you think about writing a custom foreach sink writer which can decide which record should go to which sink (based on some marker in the record, which we can annotate during transformation) and then write to the specific sink accordingly?
This will mean that:
1. every custom s…
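The routing idea above can be modeled without Spark: a single writer whose per-record method inspects a marker annotated on each record and dispatches it to the matching sink. The method names mirror Spark's ForeachWriter contract (open/process/close); the marker values and the sinks themselves are stand-ins, not anything from the thread.

```python
# Minimal Spark-free model of a marker-routing foreach writer.

class RoutingWriter:
    def __init__(self, sinks):
        # sinks: marker -> callable taking one record
        self.sinks = sinks

    def open(self, partition_id, epoch_id):
        # Real code would open sink connections here.
        return True

    def process(self, row):
        # Route each record by the marker annotated during transformation.
        self.sinks[row["marker"]](row)

    def close(self, error):
        # Real code would flush and close connections here.
        pass

sink_a, sink_b = [], []
writer = RoutingWriter({"a": sink_a.append, "b": sink_b.append})
writer.open(0, 0)
for row in [{"marker": "a", "v": 1}, {"marker": "b", "v": 2}]:
    writer.process(row)
writer.close(None)
```

As Arun notes above, the trade-off is that all routing logic is hand-written and the sinks' partitions cannot be scaled independently of the single query.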
Of course, you can write to multiple Kafka topics from a single query. If the dataframe you want to write has a column named "topic" (along with "key" and "value" columns), each row is written to the topic named in that row. This works automatically. So the only thing you need to f…
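The behavior described above can be sketched without Spark: the Kafka sink reads each row's own "topic" column, so to send one logical row to several topics you duplicate it once per target topic before the write. The topic names and record shape below are illustrative.

```python
# Spark-free model of Kafka-sink routing via a per-row "topic" column.

def kafka_sink(rows):
    """Group rows by their own 'topic' column, mimicking how the
    Kafka sink picks each row's destination topic."""
    by_topic = {}
    for r in rows:
        by_topic.setdefault(r["topic"], []).append((r["key"], r["value"]))
    return by_topic

def fan_out(row, topics):
    """Duplicate one row so it reaches every topic in `topics`."""
    return [dict(row, topic=t) for t in topics]

row = {"topic": None, "key": "k1", "value": "v1"}
rows = fan_out(row, ["alerts", "audit"])
out = kafka_sink(rows)
# out == {"alerts": [("k1", "v1")], "audit": [("k1", "v1")]}
```

This duplication is the memory cost the thread weighs against the foreachBatch alternative: each extra target topic means one extra copy of the row in the batch.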
I had a similar issue, and I think that's where the structured streaming design falls short.
Question #2 in your email seems like a viable workaround for you.
In my case, I have a custom Sink backed by an efficient in-memory column store suited for fast ingestion.
I have a Kafka stream coming from one …
Hi Priyank,
I have a similar structure, although I am reading from Kafka and sinking to multiple MySQL tables. My input stream has multiple message types, and each is headed for a different MySQL table.
I've looked for a solution for a few months and have only come up with two alternatives:
1. Si…
I have a structured streaming query which sinks to Kafka. This query has complex aggregation logic.
I would like to sink the output DF of this query to multiple Kafka topics, each partitioned on a different 'key' column. I don't want to have multiple Kafka sinks for each of the different Kafka …