Hello Spark users,

We have a Spark project with a long-running Spark session that does the following:
1. One job reads from MapR FS and writes back to MapR FS.
2. Another parallel job reads from MapR FS and writes to MinIO object storage.
For a few batches, the job that writes to MinIO reads an empty DataFrame/Dataset from MapR FS, while the job that reads from and writes to MapR FS never has any issue for the same batches. From some blogs and Stack Overflow posts, I gather that a Spark session holding configuration for both MapR FS and MinIO can sometimes hit this problem because the session or context ends up with incorrect information, and the suggested fix is to clear or restart the Spark session for each batch.

Please let me know if you have any suggestions to get rid of this issue.
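For context, the setup looks roughly like the sketch below (the endpoint, credentials, bucket, and paths are placeholders, not our real values). The MinIO settings go through the Hadoop S3A connector under the `fs.s3a.*` namespace, so in principle they should not overlap with the `maprfs://` configuration in the same session:

```scala
import org.apache.spark.sql.SparkSession

// One long-running session that holds config for both filesystems.
// MinIO is reached via the S3A connector; its settings are scoped to
// the fs.s3a.* keys (passed through with the spark.hadoop. prefix),
// separate from the MapR FS (maprfs://) defaults.
val spark = SparkSession.builder()
  .appName("maprfs-and-minio-batches")
  // MinIO-specific S3A settings -- endpoint and keys are placeholders
  .config("spark.hadoop.fs.s3a.endpoint", "http://minio.example.local:9000")
  .config("spark.hadoop.fs.s3a.access.key", "PLACEHOLDER_ACCESS_KEY")
  .config("spark.hadoop.fs.s3a.secret.key", "PLACEHOLDER_SECRET_KEY")
  .config("spark.hadoop.fs.s3a.path.style.access", "true")
  .getOrCreate()

// Job 1: MapR FS -> MapR FS (never sees the empty-read issue)
val df1 = spark.read.parquet("maprfs:///data/in/batch")
df1.write.mode("overwrite").parquet("maprfs:///data/out/batch")

// Job 2: MapR FS -> MinIO (occasionally reads an empty DataFrame)
val df2 = spark.read.parquet("maprfs:///data/in/batch")
df2.write.mode("overwrite").parquet("s3a://placeholder-bucket/out/batch")
```

Both jobs read the same batch path from MapR FS, yet only the MinIO-writing job occasionally gets an empty result.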