Re: does Spark need dedicated machines to run on?

2016-03-10 Thread Ted Yu
bq. Started SparkUI at http://192.168.2.103:4040
bq. Initial job has not accepted any resources; check your cluster UI to
ensure that workers are registered and have sufficient resources

Can you check the cluster UI?
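
For example (a sketch assuming the default standalone ports; substitute your own host name): the cluster UI is the master's web UI on port 8080, while the 4040 UI in your log is only the per-application UI.

  # open in a browser: http://shams-machine:8080
  # or query the master's JSON status endpoint from a shell:
  curl http://shams-machine:8080/json/
  # check that "workers" is non-empty and "cores" is greater than zero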

Thanks



Re: does Spark need dedicated machines to run on?

2016-03-10 Thread Shams ul Haque
Hi,

*Release of Spark:* 1.6.0; I downloaded it and built it using 'sbt/sbt
assembly'.

*command for submitting your app: *bin/spark-submit --master
spark://shams-machine:7077 --executor-cores 2 --class
in.myapp.email.combiner.CombinerRealtime
/opt/dev/workspace-luna/combiner_spark/target/combiner-0.0.1-SNAPSHOT.jar
2>&1 &
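
A variant worth trying (a sketch only: by default a standalone app asks for all available cores and 1g of memory per executor, which a laptop worker may not have free, so these flags cap the request explicitly):

  bin/spark-submit --master spark://shams-machine:7077 \
    --executor-cores 2 \
    --total-executor-cores 2 \
    --executor-memory 512m \
    --class in.myapp.email.combiner.CombinerRealtime \
    /opt/dev/workspace-luna/combiner_spark/target/combiner-0.0.1-SNAPSHOT.jar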

*code snippet of your app: *I developed a number of chained transformations
connected to Kafka, MongoDB, and Cassandra, but I tested all of them using
the *local[2]* setting in the *conf.setMaster* method, and everything works
there.
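
One caveat here: a master hardcoded with *conf.setMaster* in the code takes precedence over the --master flag passed to spark-submit, so it is safer to remove setMaster from the code entirely and select the master on the command line (hypothetical invocations):

  # local test run
  bin/spark-submit --master local[2] \
    --class in.myapp.email.combiner.CombinerRealtime \
    /opt/dev/workspace-luna/combiner_spark/target/combiner-0.0.1-SNAPSHOT.jar

  # standalone cluster run
  bin/spark-submit --master spark://shams-machine:7077 \
    --class in.myapp.email.combiner.CombinerRealtime \
    /opt/dev/workspace-luna/combiner_spark/target/combiner-0.0.1-SNAPSHOT.jar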

*pastebin of log:* http://pastebin.com/0LjTWLfm


Thanks
Shams



Re: does Spark need dedicated machines to run on?

2016-03-10 Thread Ted Yu
Can you provide a bit more information?

Release of Spark
command for submitting your app
code snippet of your app
pastebin of log

Thanks



does Spark need dedicated machines to run on?

2016-03-10 Thread Shams ul Haque
Hi,

I have developed a Spark realtime app and started a standalone Spark cluster
on my laptop. But when I try to submit the app to Spark, it always sits in
the WAITING state and its Cores count is always zero.
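
(For reference, this is roughly how the standalone daemons get started; a sketch assuming the stock sbin scripts and default ports:)

  # start the master; its web UI comes up on port 8080, the master itself on 7077
  sbin/start-master.sh

  # start one worker and register it with the master
  sbin/start-slave.sh spark://shams-machine:7077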

I have set:
export SPARK_WORKER_CORES="2"
export SPARK_EXECUTOR_CORES="1"

in spark-env.sh, but still nothing happened, and the same log entry keeps
appearing:
*TaskSchedulerImpl:70 - Initial job has not accepted any resources*
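
For completeness, a minimal conf/spark-env.sh sketch (the memory line and the comments are additions, not from my actual file; note the daemons only pick these settings up after a restart):

  export SPARK_WORKER_CORES="2"     # cores this worker offers to applications
  export SPARK_WORKER_MEMORY="1g"   # memory this worker offers (assumed value)
  export SPARK_EXECUTOR_CORES="1"   # read in YARN mode; standalone uses --executor-cores instead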

So, do I need a separate machine for all this?

Please help me sort this out.

Thanks
Shams