In IntelliJ, nothing changed. In the SBT console I got this error:
$ sbt
> console
[info] Running org.apache.spark.repl.Main -usejavacp
14/04/27 08:29:44 INFO spark.HttpServer: Starting HTTP Server
14/04/27 08:29:44 INFO server.Server: jetty-7.6.8.v20121106
14/04/27 08:29:44 INFO server.AbstractCo
You'll also need:
libraryDependencies += "org.apache.spark" %% "spark-repl" %
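For reference, a filled-in sketch of that line ("0.9.1" is an assumed Spark version, not one from the thread; match it to whatever version the rest of the build already uses):

// assumed version -- substitute the Spark version your build pins
libraryDependencies += "org.apache.spark" %% "spark-repl" % "0.9.1"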
On Sat, Apr 26, 2014 at 3:32 PM, Michael Armbrust wrote:
> This is a little bit of a hack, but might work for you. You'll need to be
> on sbt 0.13.2.
>
> connectInput in run := true
>
> outputStrategy in run := Some(StdoutOutput)
This is a little bit of a hack, but might work for you. You'll need to be
on sbt 0.13.2.
connectInput in run := true
outputStrategy in run := Some(StdoutOutput)
console := {
  (runMain in Compile).toTask(" org.apache.spark.repl.Main -usejavacp").value
}
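Putting the pieces together, a minimal build.sbt sketch along these lines might look like this (the Scala and Spark versions are assumptions, and fork in run is added on the assumption that connectInput only applies to forked processes):

// build.sbt -- consolidated sketch; versions are assumptions, not from the thread
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "0.9.1",
  "org.apache.spark" %% "spark-repl" % "0.9.1"
)

// fork the JVM and forward stdin/stdout so the REPL stays interactive
fork in run := true
connectInput in run := true
outputStrategy in run := Some(StdoutOutput)

// redefine `console` to launch the Spark REPL instead of the plain Scala REPL
console := {
  (runMain in Compile).toTask(" org.apache.spark.repl.Main -usejavacp").value
}

With that in place, "sbt console" should drop you into a Spark REPL in the terminal.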
On Sat, Apr 26, 2014 at 1:05 PM, Jonat
Hi Michael, thanks for your prompt reply.
It seems like IntelliJ Scala Console actually runs the Scala REPL (they
print the same stuff when starting up).
It is probably the SBT console.
When I tried the same code in the Scala REPL of my project using "sbt
console" it didn't work either.
It only w
The spark-shell is a special version of the Scala REPL that serves the
classes created for each line over HTTP. Do you know if the IntelliJ Spark
console is just the normal Scala REPL in a GUI wrapper, or if it is
something else entirely? If it's the former, perhaps it might be possible
to tell Int
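As a rough illustration of that mechanism (a sketch of the idea only, not Spark's actual code; the host and port are made up, though Spark 0.9 does point executors at the real server via the spark.repl.class.uri property):

import java.net.{URL, URLClassLoader}

// Hypothetical address: the driver-side REPL runs an HTTP server for the
// classes it compiles from each console line.
val replClassUri = "http://driver-host:33333/"

// Executor side: chain a loader that can fetch REPL-generated classes
// (each line compiles to classes like "$line3.$read") over HTTP before
// failing with ClassNotFoundException.
val replClassLoader =
  new URLClassLoader(Array(new URL(replClassUri)), getClass.getClassLoader)

// Deserializing task closures with this loader is what lets classes that
// exist only inside the REPL session resolve on remote executors.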
Hi all,
TLDR: running Spark locally through IntelliJ IDEA Scala Console results
in java.lang.ClassNotFoundException
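For a concrete picture, a hypothetical repro of this kind of failure (the names and snippet are illustrative, not taken from the actual session) is any task that references a class defined inside the console:

import org.apache.spark.SparkContext

val sc = new SparkContext("local", "console-test")

// Point exists only among the console session's generated classes
case class Point(x: Int, y: Int)

// Works in spark-shell; in a plain Scala REPL the task deserializer cannot
// find Point and throws java.lang.ClassNotFoundException
val points = sc.parallelize(1 to 4).map(i => Point(i, i * i))
points.collect()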
Long version:
I'm an algorithms developer at SupersonicAds, an ad network. We are
building a major new big data project and we are now in the process of
selecting our tech stack &