[
https://issues.apache.org/jira/browse/SPARK-30542?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17017967#comment-17017967
]
Sivakumar commented on SPARK-30542:
-----------------------------------
Hi Jungtaek,
I thought this might be a feature that should be added to Structured Streaming.
Also, please let me know if you have any workaround for this.
> Two Spark structured streaming jobs cannot write to same base path
> ------------------------------------------------------------------
>
> Key: SPARK-30542
> URL: https://issues.apache.org/jira/browse/SPARK-30542
> Project: Spark
> Issue Type: Bug
> Components: Structured Streaming
> Affects Versions: 2.3.0
> Reporter: Sivakumar
> Priority: Major
>
> Hi All,
> Spark Structured Streaming doesn't allow two structured streaming jobs to
> write data to the same base directory, which was possible with DStreams.
> Because a _spark_metadata directory is created by default for the first job,
> the second job cannot use the same directory as its base path: the
> _spark_metadata directory already created by the other job causes it to
> throw an exception.
> Is there any workaround for this, other than creating separate base paths
> for the two jobs?
> Is it possible to create the _spark_metadata directory elsewhere, or to
> disable it without any data loss?
> If I had to change the base path for both jobs, my whole framework would be
> impacted, so I don't want to do that.
>
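One layout that is often suggested for this situation (not a resolution from this ticket, just a sketch) is to give each streaming query its own subdirectory under the shared base path, so each query maintains a private _spark_metadata directory instead of contending for one at the base. The helper below is hypothetical (`query_paths` and the directory names are illustrative, not Spark API); each query would then pass its own `path` and `checkpointLocation` to `writeStream`:

```python
from pathlib import Path

def query_paths(base: str, query_name: str) -> tuple[str, str]:
    """Hypothetical helper: derive a per-query output and checkpoint
    directory under a shared base path, so each streaming query gets
    its own _spark_metadata directory instead of clashing at `base`."""
    out = Path(base) / query_name
    checkpoint = Path(base) / "_checkpoints" / query_name
    return str(out), str(checkpoint)

# Two queries sharing the same base path, each with isolated metadata:
out_a, cp_a = query_paths("/data/events", "job_a")
out_b, cp_b = query_paths("/data/events", "job_b")
# job_a writes its sink metadata under /data/events/job_a/_spark_metadata
# job_b writes its sink metadata under /data/events/job_b/_spark_metadata
```

Downstream batch consumers could then read the base path with a glob such as `/data/events/*/` — though note that reading through a glob bypasses the sink's metadata log, so exactly-once file listing guarantees may differ by Spark version.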
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]