Re: Running spark examples in Intellij

2017-10-11 Thread Stephen Boesch
Thinking more carefully on your comment: there may be some ambiguity as to whether the repo-provided libraries are actually being used here - as you indicate - instead of the in-project classes. That would have to do with how the classpath inside IJ was constructed. When I click t
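
A quick way to settle that ambiguity (a sketch, not from the original thread; the class name is just the one from the error further down) is to ask the JVM which jar a class is actually loaded from:

// Prints the jar (or classes directory) supplying a class on the run classpath.
// If this throws ClassNotFoundException, the class is not on the classpath at all.
object WhichJar {
  def main(args: Array[String]): Unit = {
    val cls = Class.forName("com.google.common.cache.CacheLoader")
    val source = cls.getProtectionDomain.getCodeSource   // may be null for JDK classes
    println(s"${cls.getName} loaded from: ${Option(source).map(_.getLocation)}")
  }
}

Running that under the same IntelliJ run configuration shows whether the Guava classes come from the in-project build output or from a repo-provided jar.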

Re: Running spark examples in Intellij

2017-10-11 Thread Stephen Boesch
A clarification here: the example is being run *from the Spark codebase*. Therefore the mvn install step would not be required, as the classes are available directly within the project. The reason `mvn package` needs to be invoked is to pick up the changes from having updated the spark depe

Re: Running spark examples in Intellij

2017-10-11 Thread Paul
You say you did the maven package, but did you do a maven install and define your local maven repo in SBT? -Paul
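
For reference (an illustration, not part of the original exchange), pointing an SBT build at the local Maven repository that `mvn install` populates looks roughly like this; the Spark version string is only a placeholder for whatever was installed locally:

// build.sbt sketch: resolve artifacts that `mvn install` put into ~/.m2.
resolvers += Resolver.mavenLocal

// Placeholder version; substitute the locally built Spark snapshot.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0-SNAPSHOT"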

Running spark examples in Intellij

2017-10-11 Thread Stephen Boesch
When attempting to run any example program w/ Intellij I am running into guava versioning issues:

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/cache/CacheLoader
  at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:73)
  at org.apache.spark.SparkConf.

Re: Running spark examples/scala scripts

2014-03-19 Thread Mayur Rustagi
You have to pick the right client version for your Hadoop. So basically it's going to be your Hadoop version. A mapping of Hadoop versions to CDH & Hortonworks releases is given on the Spark website. Regards Mayur Mayur Rustagi Ph: +1 (760) 203 3257 http://www.sigmoidanalytics.com @mayur_rustagi

Re: Running spark examples/scala scripts

2014-03-18 Thread Pariksheet Barapatre
:-) Thanks for the suggestion. I was actually asking how to run Spark scripts as a standalone app. I am able to run Java code and Python code as standalone apps. One more doubt: the documentation says that to read an HDFS file we need to add the dependency org.apache.hadoop:hadoop-client:1.0.1. How to know HDFS
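
As an illustration only (not from the original message), the hadoop-client dependency the documentation refers to would look like this in an sbt build; the version shown is the one quoted in the docs and must be replaced by whatever Hadoop the cluster actually runs (e.g. the matching CDH artifact):

// Sketch: hadoop-client must match the cluster's Hadoop version.
// 1.0.1 is only the version quoted in the documentation.
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.0.1"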

Re: Running spark examples/scala scripts

2014-03-18 Thread Mayur Rustagi
Print out the last line & run it outside on the shell :) Mayur Rustagi Ph: +1 (760) 203 3257 http://www.sigmoidanalytics.com @mayur_rustagi

Running spark examples/scala scripts

2014-03-17 Thread Pariksheet Barapatre
Hello all, I am trying to run the examples shipped with Spark, i.e. those in the examples directory.
[cloudera@aster2 examples]$ ls
bagel ExceptionHandlingTest.scala HdfsTest2.scala LocalKMeans.scala MultiBroadcastTest.scala SparkHdfsLR.scala SparkPi.scala BroadcastTest.scala

Re: Running spark examples

2014-03-17 Thread Chengi Liu
Hi, Thanks for the quick response. Is there a simple way to write and deploy apps on Spark? import org.apache.spark.SparkContext; import org.apache.spark.SparkContext._; object HelloWorld { def main(args: Array[String]) { println("Hello, world!") val sc = new SparkContext("local
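
The snippet above is cut off in the archive; a complete minimal standalone app along those lines (a sketch only, with an illustrative app name and a local master) would be:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

// Minimal standalone Spark app in the style of the truncated snippet above.
// "local" runs Spark in-process; the app name is illustrative.
object HelloWorld {
  def main(args: Array[String]): Unit = {
    println("Hello, world!")
    val sc = new SparkContext("local", "HelloWorld")
    val evens = sc.parallelize(1 to 1000).filter(_ % 2 == 0).count()
    println(s"Counted $evens even numbers")
    sc.stop()
  }
}

Packaged into a jar, this can be launched the same way the bundled examples are.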

Re: Running spark examples

2014-03-17 Thread Matei Zaharia
Look at the “running the examples” section of http://spark.incubator.apache.org/docs/latest/index.html; there’s a script to do it.

Running spark examples

2014-03-17 Thread Chengi Liu
Hi, I compiled the Spark examples and I see that there are a couple of jars: spark-examples_2.10-0.9.0-incubating-sources.jar and spark-examples_2.10-0.9.0-incubating.jar. If I want to run an example using these jars, which one should I run and how do I run them? Thanks