Found this issue reported earlier, but it was bulk-closed:
https://issues.apache.org/jira/browse/SPARK-27030
Regards,
Shrikant
On Fri, 22 Sep 2023 at 12:03 AM, Shrikant Prasad wrote:
Hi all,

We have multiple Spark jobs running in parallel, each trying to write into the same Hive table but into a different partition. This was working fine with Spark 2.3 and Hadoop 2.7. After upgrading to Spark 3.2 and Hadoop 3.2.2, these parallel jobs are failing with FileNotFound
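For context, each job does roughly the following (a sketch only: the table names and the `dt` partition column are illustrative, and the `partitionOverwriteMode` setting is shown as it is commonly configured for per-partition overwrites, not necessarily the exact setup from the report):

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

// Sketch of the write pattern described above. Each parallel job
// overwrites only its own partition of the shared Hive table.
val spark = SparkSession.builder()
  .appName("partitioned-write-job")
  .enableHiveSupport()
  // With dynamic partition overwrite, only the partitions present in the
  // DataFrame are replaced, not the whole table.
  .config("spark.sql.sources.partitionOverwriteMode", "dynamic")
  .getOrCreate()

import spark.implicits._

// Each job computes data for a single partition value, e.g. dt = "2023-09-22".
val df = spark.table("staging_db.events_staging")
  .filter($"dt" === "2023-09-22")

// insertInto writes into the existing partitioned Hive table; with the
// dynamic mode above, overwrite touches only the dt = "2023-09-22" partition.
df.write
  .mode(SaveMode.Overwrite)
  .insertInto("prod_db.events")
```

Since every job targets a distinct partition value, the writes should not conflict at the table level; the failure appears only under the upgraded Spark/Hadoop versions.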