I think we should try the crystal ball to answer this question.
Dr Mich Talebzadeh
*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.
On 21 September 2016 at 13:14, Jörn Franke <jornfra...@gmail.com> wrote:
> Do you mind sharing what your software does? What is the input data size?
> What is the spark version and apis used? How many nodes? What is the input
> data format? Is compression used?
> On 21 Sep 2016, at 13:37, Trinadh Kaja <ktr.hadoo...@gmail.com> wrote:
> Hi all,
> How do I increase Spark performance? I am using PySpark.
> cluster info :
> Total memory: 600 GB
> Cores: 96
> command :
> spark-submit --master yarn-client --executor-memory 10G --num-executors
> 50 --executor-cores 2 --driver-memory 10g --queue thequeue
> Please help with this.
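As a starting point, a quick back-of-the-envelope check of the submitted flags against the cluster figures quoted above (600 GB, 96 cores) already shows a problem. This is only a sketch: it ignores executor memory overhead, the driver's share, and whatever YARN reserves per node.

```python
# Resource figures taken directly from the spark-submit command in the thread.
num_executors = 50
executor_memory_gb = 10
executor_cores = 2

# Cluster totals quoted in the thread.
total_memory_gb = 600
total_cores = 96

requested_memory_gb = num_executors * executor_memory_gb
requested_cores = num_executors * executor_cores

print(f"Memory requested: {requested_memory_gb} GB of {total_memory_gb} GB")
print(f"Cores requested:  {requested_cores} of {total_cores}")
# Memory requested: 500 GB of 600 GB
# Cores requested:  100 of 96
```

The core request (50 executors x 2 cores = 100) exceeds the 96 cores available, so YARN cannot grant all 50 executors as specified; the memory request (500 GB plus overhead and the 10 GB driver) is also close to the cluster's limit. Answering the earlier questions about data size, format, and Spark version would be needed before tuning further.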