Re: [Structured Streaming] More than 1 streaming in a code

2018-04-16 Thread Aakash Basu
…the documentation, joining an aggregated streaming data frame with another streaming data frame is not supported. From: spark receiver <spark.recei...@gmail.com>; Date: Friday, April 13, 2018 at 11:49 PM…
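
A minimal sketch of the limitation cited above, assuming the built-in rate source as a stand-in input (the app name and join column are illustrative, not from the thread). In Spark 2.3 the unsupported-operation check runs when the query is started, so start() is where the failure is expected to surface:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("UnsupportedJoinSketch").getOrCreate()

    # Two stand-in streaming inputs; the rate source emits (timestamp, value) rows.
    s1 = spark.readStream.format("rate").load()
    s2 = spark.readStream.format("rate").load()

    # Aggregate one stream, then join the aggregate with the other stream.
    agg = s1.groupBy("value").agg(F.count("*").alias("cnt"))
    joined = agg.join(s2, "value")

    # Expected to raise an AnalysisException here: joining an aggregated
    # streaming DataFrame with another streaming DataFrame is unsupported.
    joined.writeStream.outputMode("complete").format("console").start()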

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-16 Thread Gerard Maas
…Date: Friday, April 13, 2018 at 11:49 PM; To: Aakash Basu <aakash.spark@gmail.com>; Cc: Panagiotis Garefalakis <panga...@gmail.com>, user <user@spark.apache.org>; Subject: Re: [Structured Streaming] More than 1 streaming in a code…

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-16 Thread Lalwani, Jayesh
…Panagiotis Garefalakis <panga...@gmail.com>, user <user@spark.apache.org>; Subject: Re: [Structured Streaming] More than 1 streaming in a code. "If I use timestamp-based windowing, then my average will not be a global average but one grouped by timestamp, which is not my requirement. I want to recalculate the avg of the entire…"
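
For contrast with windowed aggregation, a minimal sketch of a non-windowed, "global" running average; the rate source and its value column are stand-ins, not from the thread. With outputMode("complete"), the average is recomputed over all data seen so far on every trigger:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("GlobalAvgSketch").getOrCreate()

    # Stand-in streaming input; the rate source emits (timestamp, value) rows.
    stream_df = spark.readStream.format("rate").option("rowsPerSecond", 5).load()

    # No groupBy on a time window: the aggregate spans the entire stream, and
    # complete output mode re-emits the recomputed average on every trigger.
    running_avg = stream_df.agg(F.avg("value").alias("running_avg"))

    query = (running_avg.writeStream
             .outputMode("complete")
             .format("console")
             .start())
    query.awaitTermination()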

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-16 Thread Aakash Basu
…<jayesh.lalw...@capitalone.com>; Cc: spark receiver <spark.recei...@gmail.com>, Panagiotis Garefalakis <panga...@gmail.com>, user <user@spark.apache.org>; Subject: Re: [Structured Streaming] More than 1 streaming in a code. "Hey Jayesh and Others,…"

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-16 Thread Lalwani, Jayesh
…Aakash Basu <aakash.spark@gmail.com>; Date: Monday, April 16, 2018 at 4:52 AM; To: "Lalwani, Jayesh" <jayesh.lalw...@capitalone.com>; Cc: spark receiver <spark.recei...@gmail.com>, Panagiotis Garefalakis <panga...@gmail.com>, user <user@spark.apache.org>; Subject: Re: [Structured Streaming] More than 1 streaming in a code…

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-16 Thread Aakash Basu
…To: Aakash Basu <aakash.spark@gmail.com>; Cc: Panagiotis Garefalakis <panga...@gmail.com>, user <user@spark.apache.org>; Subject: Re: [Structured Streaming] More than 1 streaming in a code. "Hi Panagiotis, wondering…"

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-15 Thread Lalwani, Jayesh
…Date: Friday, April 13, 2018 at 11:49 PM; To: Aakash Basu <aakash.spark@gmail.com>; Cc: Panagiotis Garefalakis <panga...@gmail.com>, user <user@spark.apache.org>; Subject: Re: [Structured Streaming] More than 1 streaming in a code. "Hi Panagiotis, wondering whether you solved the problem or…"

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-13 Thread spark receiver
…the second query will not even start. What you could do instead is remove all the blocking calls and use spark.streams.awaitAnyTermination instead (waiting for either query1 or query2 to terminate). Make sure you do that after the query2.start call. I hope this helps.
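
A minimal sketch of the suggested pattern; the socket source, host, and port are assumptions carried over from the standard Structured Streaming example, not from the thread. Start every query first, then block once on spark.streams.awaitAnyTermination():

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import split

    spark = SparkSession.builder.appName("AwaitAnySketch").getOrCreate()

    lines = (spark.readStream
             .format("socket")
             .option("host", "localhost")
             .option("port", 9999)
             .load())
    words = lines.select(split(lines.value, " ").alias("words"))

    # Start both queries before blocking on anything.
    query1 = lines.writeStream.format("console").start()
    query2 = words.writeStream.format("console").start()

    # Placed after the query2.start() call, as advised above: returns when
    # either query terminates, and re-raises if one of them fails.
    spark.streams.awaitAnyTermination()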

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-06 Thread Aakash Basu
…wrote: "Any help? Need urgent help. Someone please clarify the doubt?" -- Forwarded message -- From: Aakash Basu <aakash.spark@gmail.com>; Date: Thu, Apr 5, 2018 at 3:18 PM; Subject: [Structured Streaming…

Re: [Structured Streaming] More than 1 streaming in a code

2018-04-06 Thread Panagiotis Garefalakis
…Someone please clarify the doubt? -- Forwarded message -- From: Aakash Basu <aakash.spark@gmail.com>; Date: Thu, Apr 5, 2018 at 3:18 PM; Subject: [Structured Streaming] More than 1 streaming in a code; To: user <user@spark.apache.org>. "Hi,…"

Fwd: [Structured Streaming] More than 1 streaming in a code

2018-04-06 Thread Aakash Basu
Any help? Need urgent help. Someone please clarify the doubt? -- Forwarded message -- From: Aakash Basu <aakash.spark@gmail.com>; Date: Thu, Apr 5, 2018 at 3:18 PM; Subject: [Structured Streaming] More than 1 streaming in a code; To: user <user@spark.apache.org>. "Hi,…"

[Structured Streaming] More than 1 streaming in a code

2018-04-05 Thread Aakash Basu
Hi, if I have more than one writeStream in my code, both operating on the same readStream data, why does only the first writeStream produce output? I want the second one to be printed on the console as well. How do I do that? from pyspark.sql import SparkSession; from pyspark.sql.functions import split,…
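
The attached code is truncated here. A hedged reconstruction of the pattern being described, with the socket source and the exact transformations as assumptions: a blocking awaitTermination() on the first query means the second writeStream is never even started, which matches the behavior reported above.

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import split

    spark = SparkSession.builder.appName("TwoWriteStreams").getOrCreate()

    lines = (spark.readStream
             .format("socket")
             .option("host", "localhost")
             .option("port", 9999)
             .load())
    words = lines.select(split(lines.value, " ").alias("words"))

    query1 = lines.writeStream.format("console").start()
    query1.awaitTermination()   # blocks the driver here ...

    query2 = words.writeStream.format("console").start()   # ... so this never runs
    query2.awaitTermination()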