This is exactly what I got:

Spark Executor Command: "java" "-cp" "::/usr/local/spark-1.0.0/conf:/usr/local/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.2.1.jar:/usr/local/hadoop/conf" "-XX:MaxPermSize=128m" "-Xms512M" "-Xmx512M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://spark@master:54477/user/CoarseGrainedScheduler" "0" "slave2" "1" "akka.tcp://sparkWorker@slave2:41483/user/Worker" "app-20140704174955-0002"
========================================
14/07/04 17:50:14 ERROR CoarseGrainedExecutorBackend: Driver Disassociated [akka.tcp://sparkExecutor@slave2:33758] -> [akka.tcp://spark@master:54477] disassociated! Shutting down.

I have posted this on Stack Overflow and received this answer:
http://stackoverflow.com/questions/24571922/apache-spark-stderr-and-stdout/24594576#24594576

I am not sure whether I need to set an IP address for the driver. Do I need a
separate machine for the driver?
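
To make the question concrete, here is a rough sketch of the kind of setting I am asking about (the 192.168.0.1 address is the master entry from my /etc/hosts; the rest is only an illustration, not my actual configuration):

# conf/spark-env.sh on the node that runs the driver (assumed here to be master)
SPARK_LOCAL_IP=192.168.0.1    # bind this node to an explicit address instead of whatever its hostname resolves to
# or, per application, conf.set("spark.driver.host", "192.168.0.1") on the SparkConf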

Best Regards 

....................................................... 

Amin Mohebbi 

PhD candidate in Software Engineering
at University of Malaysia

H/P : +60 18 2040 017 



E-Mail : tp025...@ex.apiit.edu.my 

              amin_...@me.com


On Wednesday, July 9, 2014 2:39 PM, Akhil Das <ak...@sigmoidanalytics.com> 
wrote:
 


Can you also paste a little bit more stacktrace?


Thanks
Best Regards


On Wed, Jul 9, 2014 at 12:05 PM, amin mohebbi <aminn_...@yahoo.com> wrote:

>I have the following in spark-env.sh:
>
>
>SPARK_MASTER_IP=master
>
>SPARK_MASTER_port=7077
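
A side note on the quoted lines: environment variable names are case-sensitive, so I believe the standard form of these entries is the sketch below (the SPARK_LOCAL_IP line is only my assumption, not something currently in the file):

SPARK_MASTER_IP=master        # or the 192.168.0.1 address from /etc/hosts
SPARK_MASTER_PORT=7077        # upper-case PORT
SPARK_LOCAL_IP=192.168.0.1    # assumed: explicit address for this node to bind to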
>
> 
>
>Best Regards 
>
>....................................................... 
>
>Amin Mohebbi 
>
>PhD candidate in Software Engineering
> at University of Malaysia
>
>H/P : +60 18 2040 017
>
>
>
>E-Mail : tp025...@ex.apiit.edu.my 
>
>              amin_...@me.com
>
>
>
>On Wednesday, July 9, 2014 2:32 PM, Akhil Das <ak...@sigmoidanalytics.com> 
>wrote:
> 
>
>
>Can you try setting SPARK_MASTER_IP in the spark-env.sh file?
>
>
>Thanks
>Best Regards
>
>
>On Wed, Jul 9, 2014 at 10:58 AM, amin mohebbi <aminn_...@yahoo.com> wrote:
>
>
>>
>> Hi all,
>>I have one master and two slave nodes. I did not set any IP for the Spark driver
>>because I thought it uses its default (localhost). In my /etc/hosts I have the
>>following: 192.168.0.1 master, 192.168.0.2 slave, 192.168.0.3 slave2,
>>127.0.0.1 localhost and 127.0.1.1 virtualbox. Should I do something in
>>hosts, or should I set an IP in SPARK_LOCAL_IP? I got the following in my
>>stderr:
>>
>>
>>Spark Executor Command: "java" "-cp" "::/usr/local/spark-1.0.0/conf:/usr/local/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-hadoop1.2.1.jar:/usr/local/hadoop/conf" "-XX:MaxPermSize=128m" "-Xms512M" "-Xmx512M" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "akka.tcp://spark@master:54477/user/CoarseGrainedScheduler" "0" "slave2" "1" "akka.tcp://sparkWorker@slave2:41483/user/Worker" "app-20140704174955-0002"
>>========================================
>>14/07/04 17:50:14 ERROR CoarseGrainedExecutorBackend: Driver Disassociated [akka.tcp://sparkExecutor@slave2:33758] -> [akka.tcp://spark@master:54477] disassociated! Shutting down.
>> 
>>
>>
>> 
>>
>>Best Regards 
>>
>>....................................................... 
>>
>>Amin Mohebbi 
>>
>>PhD candidate in Software Engineering
>> at University of Malaysia
>>
>>H/P : +60 18 2040 017
>>
>>
>>
>>E-Mail : tp025...@ex.apiit.edu.my 
>>
>>              amin_...@me.com
>
>
>
