Hi,

I have configured Spark to run on YARN. Whenever I start the Spark shell
with the 'spark-shell' command, the process gets killed. The output looks
like this:

ubuntu@dev-cluster-gateway:~$ ls shekhar/
edx-spark
ubuntu@dev-cluster-gateway:~$ spark-shell
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.2.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_75)
Type in expressions to have them evaluated.
Type :help for more information.
15/06/10 05:20:45 WARN Utils: Your hostname, dev-cluster-gateway resolves
to a loopback address: 127.0.0.1; using 10.182.149.171 instead (on
interface eth0)
15/06/10 05:20:45 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to
another address
15/06/10 05:21:20 WARN NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
/usr/lib/spark/bin/spark-shell: line 48: 15573 Killed                  "$FWDIR"/bin/spark-submit --class org.apache.spark.repl.Main "${SUBMISSION_OPTS[@]}" spark-shell "${APPLICATION_OPTS[@]}"


Any clue why the Spark shell gets killed? Please let me know if any other
configuration or information is required.
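
One other thing I noticed: the log warns that my hostname resolves to a
loopback address and suggests setting SPARK_LOCAL_IP. I'm not sure this is
related to the kill, but in case it matters, here is what I would try (a
sketch using the eth0 address reported in the warning above):

    # Pin Spark's bind address to the interface address from the warning
    export SPARK_LOCAL_IP=10.182.149.171
    spark-shell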

Thanks,
Chandrash3khar Kotekar
