Re: Mesos Spark Fine Grained Execution - CPU count

2016-12-24 Thread Davies Liu
Using 0 for spark.mesos.mesosExecutor.cores is better than dynamic allocation, but you have to pay a little more overhead when launching each task, which should be OK if the tasks are not trivial. Since the direct result (up to 1 MB by default) will also go through Mesos, it's better to tune it lower.
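
For concreteness, a minimal Scala sketch of that configuration, assuming Spark 2.x property names; the master URL is a placeholder, and it is my assumption that the direct-result limit Davies refers to is spark.task.maxDirectResultSize (1 MB by default):

    import org.apache.spark.{SparkConf, SparkContext}

    // Fine-grained Mesos mode: the executor reserves no CPUs of its own,
    // so each task acquires its CPU share from Mesos only while it runs.
    val conf = new SparkConf()
      .setAppName("mesos-fine-grained-sketch")
      .setMaster("mesos://zk://mesos-master:2181/mesos") // placeholder URL
      .set("spark.mesos.coarse", "false")                // fine-grained mode
      .set("spark.mesos.mesosExecutor.cores", "0")
      // Task results at or below this size are shipped back directly, and in
      // fine-grained mode that path goes through Mesos, so keep it small.
      .set("spark.task.maxDirectResultSize", (256 * 1024).toString) // 256 KB
    val sc = new SparkContext(conf)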

Re: How many Spark streaming applications can be run at a time on a Spark cluster?

2016-12-24 Thread Dirceu Semighini Filho
Hi, you can start multiple Spark apps per cluster; you will have one streaming context per app.
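
As a minimal sketch of the "one streaming context per app" point: each submitted application builds exactly one StreamingContext, and running several streaming jobs simply means submitting several such applications. The app name, batch interval, and socket source below are illustrative only.

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object OneStreamingApp { // hypothetical app; submit one per streaming job
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("streaming-app-1")
        val ssc = new StreamingContext(conf, Seconds(10)) // the app's single context

        val lines = ssc.socketTextStream("localhost", 9999) // illustrative source
        lines.count().print()

        ssc.start()
        ssc.awaitTermination()
      }
    }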

Re: How many Spark streaming applications can be run at a time on a Spark cluster?

2016-12-24 Thread shyla deshpande
Hi All, Thank you for the response. As per https://docs.cloud.databricks.com/docs/latest/databricks_guide/index.html#07%20Spark%20Streaming/15%20Streaming%20FAQs.html there can be only one streaming context in a cluster, which implies only one streaming job. So I am still confused. Anyone …

Re: Reply: submit spark task on yarn asynchronously via java?

2016-12-24 Thread Naveen
Hi, please use the SparkLauncher API class and invoke the submissions asynchronously using Futures. With SparkLauncher, you can specify the main class name, the application resource, arguments to be passed to the driver, the deploy mode, etc. I would suggest using Scala's Future if Scala code is possible.
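
A minimal sketch of that suggestion, assuming a YARN cluster and placeholder jar/class names: SparkLauncher assembles the spark-submit invocation, and wrapping the blocking wait in a Scala Future keeps the caller asynchronous.

    import org.apache.spark.launcher.SparkLauncher
    import scala.concurrent.{ExecutionContext, Future}
    import ExecutionContext.Implicits.global

    def submitAsync(): Future[Int] = Future {
      new SparkLauncher()
        .setAppResource("/path/to/my-app.jar") // placeholder jar
        .setMainClass("com.example.MyDriver")  // placeholder driver class
        .setMaster("yarn")
        .setDeployMode("cluster")
        .addAppArgs("arg1", "arg2")
        .launch()   // spawns a spark-submit process
        .waitFor()  // exit code; 0 means the submission process succeeded
    }

    // Usage: fire and react to completion without blocking the caller.
    submitAsync().foreach(code => println(s"spark-submit exited with $code"))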

Re: Launching multiple spark jobs within a main spark job.

2016-12-24 Thread Naveen
Thanks Liang, Vadim, and everyone for your inputs!! With this clarity, I've tried client mode for both the main and sub spark jobs. Every main spark job and its corresponding threaded spark jobs are coming up on the YARN applications list, and the jobs are getting executed properly. I now need to test …
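
For reference, a self-contained sketch of the pattern described, with placeholder jar and class names: the main driver submits several sub-jobs concurrently, each of which appears as its own application on YARN, and then waits for all of them.

    import org.apache.spark.launcher.SparkLauncher
    import scala.concurrent.{Await, ExecutionContext, Future}
    import scala.concurrent.duration.Duration
    import ExecutionContext.Implicits.global

    // Launch three sub-jobs in parallel from the main driver.
    val subJobs: Seq[Future[Int]] = (1 to 3).map { i =>
      Future {
        new SparkLauncher()
          .setAppResource("/path/to/sub-job.jar") // placeholder jar
          .setMainClass("com.example.SubJob")     // placeholder class
          .setMaster("yarn")
          .setDeployMode("client")                // client mode, as tried above
          .addAppArgs(s"input-$i")
          .launch()
          .waitFor()
      }
    }

    // Block until every sub-job's spark-submit process exits.
    val exitCodes = Await.result(Future.sequence(subJobs), Duration.Inf)
    require(exitCodes.forall(_ == 0), "one or more sub-jobs failed")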