Hi, I have an application that serves as an ETL job, and I have hundreds of such
ETL jobs that run daily. Right now I have just one Spark session, which is
shared by all these jobs, and sometimes all of them run at the same time,
causing the Spark session to die, mostly due to memory issues. Is this a good
design? I am thinking of creating multiple Spark sessions, possibly one Spark
session per ETL job, but there is a delay in starting a Spark session, which
seems to multiply by the number of ETL jobs. Please share best practices and
designs for such problems. Thanks in advance.
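
For context, here is a minimal sketch of the kind of per-job isolation I am
considering, assuming all jobs run inside the same driver JVM. The job names
and the runEtl helper are hypothetical placeholders; the idea is that
SparkSession.newSession() reuses the existing SparkContext (so no extra
startup delay) while giving each job its own SQL configuration, temp views,
and UDF registrations:

import org.apache.spark.sql.SparkSession

object EtlRunner {
  def main(args: Array[String]): Unit = {
    // One shared SparkContext/driver, started once.
    val base = SparkSession.builder()
      .appName("etl-driver") // hypothetical app name
      .getOrCreate()

    // Hypothetical list of ETL job names; in practice these are my real jobs.
    val etlJobs = Seq("job_a", "job_b", "job_c")

    etlJobs.foreach { jobName =>
      // newSession() shares the SparkContext, so there is no per-session
      // startup cost, but isolates SQL conf, temp views and UDFs per job.
      val session = base.newSession()
      runEtl(session, jobName)
    }

    base.stop()
  }

  // Placeholder for a single ETL job's logic.
  def runEtl(spark: SparkSession, jobName: String): Unit = {
    // e.g. spark.read.parquet(...).transform(...).write.parquet(...)
    println(s"Running ETL job: $jobName")
  }
}

This still shares one driver and one pool of executors, so I am not sure it
solves the memory contention; that is exactly the design question I am asking.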


