-----Original Message-----
> From: elyast [mailto:lukasz.jastrzeb...@gmail.com]
> Sent: Friday, March 07, 2014 20:01
> To: u...@spark.incubator.apache.org
> Subject: Re: major Spark performance problem
>
> Hi,
>
There is also an option to run Spark applications on top of Mesos in
fine-grained mode. This makes fair scheduling possible (applications run in
parallel, and Mesos is responsible for scheduling all tasks), so it may not be
faster, but the benefit is the fair scheduling: small jobs will not be
stuck behind the big ones.
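For readers of the archive: fine-grained mode is selected in the Spark
configuration when pointing the master at Mesos. A minimal sketch (the
property name is the standard `spark.mesos.coarse` flag from the Spark-on-Mesos
documentation; the master URL is a placeholder, not from this thread):

```
# spark-defaults.conf (sketch) -- run on Mesos in fine-grained mode
spark.master        mesos://zk://host1:2181/mesos   # placeholder Mesos master
spark.mesos.coarse  false                           # false = fine-grained mode
```

In fine-grained mode each Spark task is launched as a separate Mesos task, so
the cluster's resources are rebalanced between applications task by task,
which is what gives the fair-scheduling behavior described above.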
Best regards
Lukasz Jastrzebski
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/major-Spark-performance-problem-tp2364p2403.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Dana,

When you run multiple "applications" under Spark, and each application
takes up the entire cluster's resources, it is expected that one will block
the other completely; that is why you see the wall times add up
sequentially. In addition, there is some overhead associated with starting
up each application.
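One common way to avoid this blocking (a sketch, not taken from the thread
itself: the property names are standard Spark configuration keys, the values
are placeholders) is to cap each application's share of the cluster so that
several applications can hold resources at the same time:

```
# spark-defaults.conf (sketch) -- keep one application from monopolizing the cluster
spark.cores.max        4    # max total cores each application may claim
spark.executor.memory  2g   # memory per executor, per application
```

With caps like these, a second application submitted while the first is
running can still acquire cores and start immediately, instead of queuing
behind it.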
Hi all,

We have a big issue and would appreciate any insights or ideas. The problem
is composed of two connected problems:

1. Run time of a single application.
2. Run time of multiple applications in parallel scales almost linearly
with the run time of a single application.
We have