Re: Install via directions in "Learning Spark". Exception when running bin/pyspark

2015-10-13 Thread David Bess
Got it working!  Thank you for confirming my suspicion that this issue was
related to Java.  When I dug deeper I found multiple versions and some other
issues.  After working on it for a while, I decided it would be easier to
uninstall all Java and reinstall a clean JDK, and now it works perfectly.  
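For anyone hitting the same symptom, a minimal sketch of commands for spotting multiple Java installs on OS X (these are standard OS X/Unix tools, not commands from the original post):

```shell
# Show which JVM the shell actually picks up first.
java -version 2>&1 | head -n 1
# List every java binary on the PATH (more than one can mean a stale install).
which -a java || true
# List all registered JDKs (OS X-specific helper; absent on other systems).
/usr/libexec/java_home -V 2>&1 || true
```

If the version the shell reports is older than the JDK you just installed, the PATH or JAVA_HOME is likely pointing at the stale install.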



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Install-via-directions-in-Learning-Spark-Exception-when-running-bin-pyspark-tp25043p25049.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Install via directions in "Learning Spark". Exception when running bin/pyspark

2015-10-13 Thread Robineast
What you have done should work.

A couple of things to try:

1) You should have a lib directory in your Spark deployment containing a
jar file called lib/spark-assembly-1.5.1-hadoop2.6.0.jar. Is it there?
2) Have you set the JAVA_HOME environment variable to point to your Java 8
installation? If not, try doing that.
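A sketch of those two checks as shell commands, run from the Spark directory (the jar name assumes the prebuilt 1.5.1 / Hadoop 2.6 download from this thread; the java_home line is an OS X-specific example):

```shell
# Check 1: is the assembly jar where the launcher expects it?
ls -l lib/spark-assembly-1.5.1-hadoop2.6.0.jar 2>/dev/null \
  || echo "assembly jar missing - incomplete download or wrong directory?"
# Check 2: is JAVA_HOME set, and to what?
echo "JAVA_HOME=${JAVA_HOME:-<not set>}"
# If unset, point it at the Java 8 install, e.g. on OS X:
# export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
```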

Robin



-
Robin East 
Spark GraphX in Action Michael Malak and Robin East 
Manning Publications Co. 
http://www.manning.com/books/spark-graphx-in-action




Install via directions in "Learning Spark". Exception when running bin/pyspark

2015-10-12 Thread David Bess
Greetings all,

Excited to be learning Spark.  I am working through the "Learning Spark"
book, but I am having trouble getting Spark installed and running.  

This is what I have done so far.  

I installed Spark from here: 

http://spark.apache.org/downloads.html

selecting 1.5.1, pre-built for Hadoop 2.6 and later, direct download.

I untarred the download:

cd downloads
tar -xf spark-1.5.1-bin-hadoop2.6.tgz
cd spark-1.5.1-bin-hadoop2.6
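One quick sanity check at this point (filenames from the steps above): verify the archive is readable end-to-end and the launcher scripts are present, since a truncated download can also leave classes missing.

```shell
# Verify the whole archive can be read, then check the launcher scripts exist.
tar -tzf spark-1.5.1-bin-hadoop2.6.tgz > /dev/null && echo "archive OK"
ls spark-1.5.1-bin-hadoop2.6/bin/pyspark \
   spark-1.5.1-bin-hadoop2.6/bin/spark-shell 2>/dev/null \
  || echo "launcher scripts missing"
```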

Next I tried running a shell; the example in the book says we can run in
local mode, with no need to install Hadoop / YARN / Mesos or anything else
to get started.  

I have tried the following commands:

./bin/pyspark
bin/pyspark
./bin/spark-shell
bin/spark-shell

I am getting an error as follows:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/Main
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.Main
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)

About my system:

I have a MacBook Pro running OS X Yosemite 10.10.5.
I just downloaded and installed the latest Java from the Oracle website; I
believe this was Java 8u60.
I double-checked my Python version and it appears to be 2.7.10.

I am familiar with the command line and have a background in Hadoop, but
this has me stumped.  

Thanks in advance,

David Bess





