Hi,
can you please give more details around this? What is the requirement? What
is the Spark version you are using? What do you mean by multiple sources?
What are these sources?
Regards,
Gourav Sengupta
On Wed, Aug 25, 2021 at 3:51 AM Artemis User wrote:
> Thanks Daniel. I guess you were suggesting using DStream/RDD.
No, that applies to the streaming DataFrame API too.
No, jobs can't communicate with each other.
On Tue, Aug 24, 2021 at 9:51 PM Artemis User wrote:
> Thanks Daniel. I guess you were suggesting using DStream/RDD. Would it
> be possible to use structured streaming/DataFrames for multi-source
> streaming?
Thanks Daniel. I guess you were suggesting using DStream/RDD. Would it
be possible to use structured streaming/DataFrames for multi-source
streaming? In addition, we really need each stream's data ingestion to be
asynchronous or non-blocking... thanks!
On 8/24/21 9:27 PM, daniel williams wrote:
Is there a way to run multiple streams in a single Spark job using
Structured Streaming? If not, is there an easy way to perform inter-job
communications (e.g. referencing a dataframe among concurrent jobs) in
Spark? Thanks a lot in advance!
-- ND
---
Hi,
I received a response from AWS, this is an issue with EMR, and they are
working on resolving the issue I believe.
Thanks and Regards,
Gourav Sengupta
On Mon, Aug 23, 2021 at 1:35 PM Gourav Sengupta <
gourav.sengupta.develo...@gmail.com> wrote:
> Hi,
>
> the query still gives the same error