You'll also need:

libraryDependencies += "org.apache.spark" %% "spark-repl" % "<spark version>"
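
For reference, a complete build.sbt for this approach might look like the
following sketch (versions taken from this thread: Spark 0.9.1, which targets
Scala 2.10; adjust to your setup):

  scalaVersion := "2.10.4"

  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "0.9.1",
    "org.apache.spark" %% "spark-repl" % "0.9.1"
  )

  // Forward stdin to the process so the REPL stays interactive.
  connectInput in run := true

  outputStrategy in run := Some(StdoutOutput)

  // Replace sbt's `console` task with the Spark REPL (requires sbt 0.13.2).
  console := {
    (runMain in Compile).toTask(" org.apache.spark.repl.Main -usejavacp").value
  }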



On Sat, Apr 26, 2014 at 3:32 PM, Michael Armbrust <mich...@databricks.com> wrote:

> This is a little bit of a hack, but might work for you.  You'll need to be
> on sbt 0.13.2.
>
> connectInput in run := true
>
> outputStrategy in run := Some(StdoutOutput)
>
> console := {
>   (runMain in Compile).toTask(" org.apache.spark.repl.Main -usejavacp").value
> }
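>
> With those settings in place, a minimal usage sketch (assuming the project
> compiles) is:
>
>   $ sbt console
>   ...
>   scala> sc.parallelize(1 to 10).count()
>
> The Spark REPL should bind a SparkContext as `sc` on startup; the exact
> banner may differ by version.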
>
>
> On Sat, Apr 26, 2014 at 1:05 PM, Jonathan Chayat <
> jonatha...@supersonicads.com> wrote:
>
>> Hi Michael, thanks for your prompt reply.
>>
>> It seems like the IntelliJ Scala Console actually runs the Scala REPL (they
>> print the same messages on startup); it is probably just the SBT console.
>>
>> When I tried the same code in my project's Scala REPL via "sbt console",
>> it didn't work either; it only worked in the Spark project's
>> bin/spark-shell.
>>
>> Is there a way to customize the SBT console of a project that lists Spark
>> as a dependency?
>>
>> Thx,
>>     Jon
>>
>>
>> On Sat, Apr 26, 2014 at 9:42 PM, Michael Armbrust <mich...@databricks.com> wrote:
>>
>>> The spark-shell is a special version of the Scala REPL that serves the
>>> classes created for each line over HTTP.  Do you know if the IntelliJ Spark
>>> console is just the normal Scala REPL in a GUI wrapper, or if it is
>>> something else entirely?  If it's the former, perhaps it might be possible
>>> to tell IntelliJ to bring up the Spark version instead.
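>>>
>>> To make that concrete, here is a self-contained sketch (names are
>>> illustrative, not actual Spark source) of the wrapping the REPL performs;
>>> with Scala 2.10, the printed class name resembles the one in the stack
>>> trace below:
>>>
>>>   object $line5 {
>>>     object $read {
>>>       object $iw {
>>>         // this lambda compiles to a synthetic class like ...$$anonfun$1
>>>         val isOdd: Int => Boolean = _ % 2 == 1
>>>       }
>>>     }
>>>   }
>>>
>>>   object WrapperDemo extends App {
>>>     // prints something like: $line5$$read$$iw$$anonfun$1
>>>     println($line5.$read.$iw.isOdd.getClass.getName)
>>>   }
>>>
>>> spark-shell serves these generated classes to executors over HTTP; a plain
>>> Scala REPL does not, so executors cannot load them.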
>>>
>>>
>>> On Sat, Apr 26, 2014 at 10:47 AM, Jonathan Chayat <
>>> jonatha...@supersonicads.com> wrote:
>>>
>>>> Hi all,
>>>>
>>>> TL;DR: running Spark locally through the IntelliJ IDEA Scala Console
>>>> results in java.lang.ClassNotFoundException.
>>>>
>>>> Long version:
>>>>
>>>> I'm an algorithms developer at SupersonicAds, an ad network. We are
>>>> building a major new big data project, and we are now in the process of
>>>> selecting our tech stack & tools.
>>>>
>>>> I'm new to Spark, but I'm very excited about it. It is my opinion that
>>>> Spark can be a great tool for us, and that we might be able to build most
>>>> of our toolchain on top of it.
>>>>
>>>> We currently develop in Scala and we are using IntelliJ IDEA as our IDE
>>>> (we love it). One of the features I love about IDEA is the Scala Console,
>>>> which lets me work interactively with all of my project's code available
>>>> and all of the IDE's features & conveniences. That is as opposed to the
>>>> Scala shell & Spark shell, which I dislike because they are based on JLine
>>>> and don't behave like a good shell should (I can't even Ctrl-C to abort a
>>>> line without crashing the whole thing). Of course, as an algo guy, having
>>>> a good REPL is crucial to me.
>>>>
>>>> To get started, I added the following line to build.sbt:
>>>>
>>>>> libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"
>>>>
>>>>
>>>> Then I added the following main class:
>>>>
>>>>> import org.apache.spark.SparkContext
>>>>> import org.apache.spark.SparkContext._
>>>>>
>>>>> object Main extends App {
>>>>>   val sc = new SparkContext("local", "myApp")
>>>>>   val r = sc.parallelize(1 to 1000)
>>>>>   println("r.filter(_ % 2 == 1).first() = " + r.filter(_ % 2 == 1).first())
>>>>>   println("r.filter(_ % 2 == 1).count() = " + r.filter(_ % 2 == 1).count())
>>>>> }
>>>>
>>>> Make, Run, Works perfectly.
>>>>
>>>> Next, I tried running the same code in the Scala Console.
>>>> Bad news: the last line throws an exception:
>>>>
>>>>> ERROR executor.Executor: Exception in task ID 0
>>>>> java.lang.ClassNotFoundException:
>>>>> $line5.$read$$iw$$iw$$iw$$iw$$anonfun$2
>>>>
>>>>
>>>> My guess is that for some reason Spark is not able to find the anonymous
>>>> function (_ % 2 == 1). Note that I'm running locally, so I did not provide
>>>> any jars. For some reason it works when using first() instead of count()
>>>> (perhaps because first() can compute its single partition locally on the
>>>> driver, so the closure never needs to be shipped). Needless to say, it
>>>> also works in the Spark shell, but as I stated, working with that is not
>>>> an option.
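>>>>
>>>> (One possible workaround, sketched under the assumption that classes
>>>> compiled into the project are visible to the executor's classloader while
>>>> REPL-generated classes are not: define the predicate as a named
>>>> serializable class in project source rather than as a console lambda. The
>>>> name IsOdd is hypothetical.)
>>>>
>>>>   // In compiled project source, not typed into the console:
>>>>   class IsOdd extends (Int => Boolean) with Serializable {
>>>>     def apply(x: Int): Boolean = x % 2 == 1
>>>>   }
>>>>
>>>> In the console, r.filter(new IsOdd).count() then serializes an instance
>>>> of a class the executor can already load.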
>>>>
>>>> This issue brings much sadness to my heart, and I could not find a
>>>> solution on the mailing list archives or elsewhere. I am hoping someone
>>>> here might offer some help.
>>>>
>>>> Thanks,
>>>>     Jon
>>>>
>>>
>>>
>>
>
