Hi,

I created 2 workers on the same machine, each with 4 cores and 6 GB of RAM.

I submitted the first job, and it allocated 2 cores on each of the worker
processes and used the full 4 GB of RAM for each executor process.

When I submit my second job, it always stays in the WAITING state.
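
If I follow the suggestion below correctly, I also need to cap executor
memory, not just cores, so that both jobs fit on the workers. A rough sketch
of what I plan to try (class names, jar paths, and the master URL are
placeholders; the splits assume my 2 workers with 4 cores / 6 GB each):

# Job 1: cap at 2 cores total and 2 GB per executor
./bin/spark-submit \
  --class com.example.StreamingJobOne \
  --master spark://<master-host>:7077 \
  --executor-memory 2G \
  --total-executor-cores 2 \
  /path/to/job-one.jar

# Job 2: same caps, so each worker still has cores and memory left for it
./bin/spark-submit \
  --class com.example.StreamingJobTwo \
  --master spark://<master-host>:7077 \
  --executor-memory 2G \
  --total-executor-cores 2 \
  /path/to/job-two.jar

With each job capped this way, the standalone scheduler should be able to
give both jobs executors at the same time instead of queuing the second one.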


Cheers!!



On Tue, Oct 20, 2015 at 10:46 AM, Tathagata Das <t...@databricks.com> wrote:

> You can set the max cores for the first submitted job such that it does
> not take all the resources from the master. See
> http://spark.apache.org/docs/latest/submitting-applications.html
>
> # Run on a Spark standalone cluster in client deploy mode
> ./bin/spark-submit \
>   --class org.apache.spark.examples.SparkPi \
>   --master spark://207.184.161.138:7077 \
>   --executor-memory 20G \
>   --total-executor-cores 100 \
>   /path/to/examples.jar \
>   1000
>
>
> On Mon, Oct 19, 2015 at 4:26 PM, Augustus Hong <augus...@branchmetrics.io>
> wrote:
>
>> Hi All,
>>
>> Would it be possible to run multiple Spark Streaming jobs on a single
>> master at the same time?
>>
>> I currently have one master node and several worker nodes in standalone
>> mode, and I used spark-submit to submit multiple Spark Streaming jobs.
>>
>> From what I observed, it seems that only the first submitted job gets
>> resources and runs. Jobs submitted afterwards stay in the "Waiting" state
>> and only run after the first one finishes or is killed.
>>
>> I tried limiting each executor to only 1 core (each worker machine has 8
>> cores), but the same thing happens: only one job runs, even though there
>> are plenty of idle cores.
>>
>> Best,
>> Augustus
>>
>>
>>
>> --
>> Augustus Hong
>>  Data Analytics | Branch Metrics <http://branch.io/>
>>  m 650-391-3369 | e augus...@branch.io
>>
>
>
