Thanks guys. I do have HADOOP_INSTALL set, but Spark 1.4.1 did not seem to mind. Seems like there's a difference in behavior between 1.5.0 and 1.4.1 for some reason.
To the best of my knowledge, I just downloaded each tgz and untarred it in /opt. I adjusted my PATH to point to one or the other, but that should be about it. Does 1.5.0 pick up HADOOP_INSTALL? Wouldn't spark-shell --master local override that? 1.5 seemed to completely ignore --master local.

--
Madhu
https://www.linkedin.com/in/msiddalingaiah
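In case it helps with reproducing this, here's the sanity check I can run (a sketch, assuming the stock spark-shell, which creates sc for you):

    // Inside the REPL, ask the SparkContext which master it actually resolved.
    scala> sc.master   // expect "local" if --master local took effect

If 1.5.0 reports something other than "local" (e.g. a YARN master), that would confirm the flag is being overridden. I can also try relaunching with the variable unset, e.g. env -u HADOOP_INSTALL bin/spark-shell --master local, to see whether HADOOP_INSTALL is what triggers the different behavior.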