This error message means "I can't find the configuration for the Akka
subsystem". That configuration is normally bundled inside the Spark
assembly jar.
First, build your Spark distribution by running sbt/sbt assembly in the
SPARK_HOME dir. Then set SPARK_HOME (through an environment variable or
configuration) to point to that same directory.
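
For example, assuming your Spark checkout lives at ~/spark (adjust the
path to your setup):

    cd ~/spark
    sbt/sbt assembly          # builds the assembly jar, which bundles Akka's config
    export SPARK_HOME=~/spark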

See "running standalone app" here:
http://spark.apache.org/docs/latest/quick-start.html
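
The standalone app there amounts to something like this (a minimal
sketch; the object name and app name are just placeholders):

    import org.apache.spark.SparkContext

    object SimpleApp {
      def main(args: Array[String]): Unit = {
        // "local" runs Spark in-process; the second argument is the app name
        val sc = new SparkContext("local", "Simple App")
        println(sc.parallelize(1 to 100).count())  // trivial job to verify the setup
        sc.stop()
      }
    }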

-kr, Gerard.





On Tue, May 20, 2014 at 5:01 PM, Greg <g...@zooniverse.org> wrote:

> Hi,
> I have the following Scala code:
> ===---
> import org.apache.spark.SparkContext
>
> // main must live in an object and take an args parameter to be runnable
> object test {
>   def main(args: Array[String]): Unit = {
>     val sc = new SparkContext("local", "Scala Word Count")
>   }
> }
> ===---
> and the following build.sbt file
> ===---
> name := "test"
>
> version := "1.0"
>
> scalaVersion := "2.10.4"
>
> libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"
>
> libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.0.4"
>
> libraryDependencies += "org.mongodb" % "mongo-java-driver" % "2.11.4"
>
> libraryDependencies += "org.mongodb" % "mongo-hadoop-core" % "1.0.0"
>
> resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
> ===---
> I get the following error:
> com.typesafe.config.ConfigException$Missing: No configuration setting
> found for key 'akka.version'
>         at com.typesafe.config.impl.SimpleConfig.findKey(test.sc3587202794988350330.tmp:111)
>         at com.typesafe.config.impl.SimpleConfig.find(test.sc3587202794988350330.tmp:132)
>
>
> Any suggestions on how to fix this?
> thanks, Greg
