Umesh,

I found the following write-up, which covers Spark's architecture and memory
considerations in depth. Spark's memory model has been updated since it was
written, but it should be a good starting point for you:
https://0x0fff.com/spark-architecture/

Any additional sources of information from others are welcome too.

- Venkat.

On Sun, Sep 16, 2018 at 11:45 PM unk1102 <umesh.ka...@gmail.com> wrote:

> Hi, I have hundreds of ETL jobs that run daily. Currently they all share a
> single Spark session, and when many of them run at the same time, the
> session often dies, mostly due to memory issues. Is this a good design? I
> am considering creating multiple Spark sessions, possibly one per ETL job,
> but there is a delay in starting a Spark session, and that delay would be
> multiplied by the number of jobs. Please share best practices and designs
> for such problems. Thanks in advance.
>
