Hi, I'm learning Spark 0.9 from its tutorial. To write my first application in Scala, I followed the instructions in A Standalone App in Scala<http://spark.incubator.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala>, but compiling my application failed. To be specific, in Spark's home directory (/home/soft/spark-0.9.0-incubating-bin-hadoop1) I created the directory src/main/scala, put SimpleApp.scala in it, and put simple.sbt in Spark's home directory.
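For reference, my simple.sbt follows the tutorial's template and looks roughly like the following (I am quoting the version strings from memory, so they may not match my actual file exactly):

    name := "Simple Project"

    version := "1.0"

    scalaVersion := "2.10.3"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"

    resolvers += "Akka Repository" at "http://repo.akka.io/releases/"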
Then I tried to compile my application with the command "sbt/sbt package" and got the following errors:

[root@dev4 spark-0.9.0-incubating-bin-hadoop1]# sbt/sbt package
Launching sbt from sbt/sbt-launch-0.12.4.jar
[info] Loading project definition from /home/soft/spark-0.9.0-incubating-bin-hadoop1/project/project
[info] Loading project definition from /home/soft/spark-0.9.0-incubating-bin-hadoop1/project
[info] Set current project to Simple App Project (in build file:/home/soft/spark-0.9.0-incubating-bin-hadoop1/)
Getting Scala 2.10 ...
:: problems summary ::
:::: WARNINGS
    module not found: org.scala-lang#scala-compiler;2.10
    ==== local: tried
      /root/.ivy2/local/org.scala-lang/scala-compiler/2.10/ivys/ivy.xml
    ==== typesafe-ivy-releases: tried
      http://repo.typesafe.com/typesafe/ivy-releases/org.scala-lang/scala-compiler/2.10/ivys/ivy.xml
    ==== Maven Central: tried
      http://repo1.maven.org/maven2/org/scala-lang/scala-compiler/2.10/scala-compiler-2.10.pom
    ==== sonatype-snapshots: tried
      https://oss.sonatype.org/content/repositories/snapshots/org/scala-lang/scala-compiler/2.10/scala-compiler-2.10.pom
    module not found: org.scala-lang#scala-library;2.10
    ==== local: tried
      /root/.ivy2/local/org.scala-lang/scala-library/2.10/ivys/ivy.xml
    ==== typesafe-ivy-releases: tried
      http://repo.typesafe.com/typesafe/ivy-releases/org.scala-lang/scala-library/2.10/ivys/ivy.xml
    ==== Maven Central: tried
      http://repo1.maven.org/maven2/org/scala-lang/scala-library/2.10/scala-library-2.10.pom
    ==== sonatype-snapshots: tried
      https://oss.sonatype.org/content/repositories/snapshots/org/scala-lang/scala-library/2.10/scala-library-2.10.pom
::::::::::::::::::::::::::::::::::::::::::::::
::          UNRESOLVED DEPENDENCIES         ::
::::::::::::::::::::::::::::::::::::::::::::::
:: org.scala-lang#scala-compiler;2.10: not found
:: org.scala-lang#scala-library;2.10: not found
::::::::::::::::::::::::::::::::::::::::::::::
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
unresolved dependency: org.scala-lang#scala-compiler;2.10: not found
unresolved dependency: org.scala-lang#scala-library;2.10: not found
Error during sbt execution: Error retrieving required libraries
  (see /root/.sbt/boot/update.log for complete log)
xsbti.RetrieveException: Could not retrieve Scala 2.10
    at xsbt.boot.ModuleDefinition.fail(ResolverHelper.java:12)
    at xsbt.boot.ModuleDefinition.retrieveFailed(ResolverHelper.java:9)
    at xsbt.boot.Launch.update(Launch.scala:267)
    at xsbt.boot.Launch$$anonfun$xsbt$boot$Launch$$getScalaProvider0$3.apply(Launch.scala:181)
    at scala.Option.getOrElse(Option.scala:108)
    at xsbt.boot.Launch$$anon$3.call(Launch.scala:167)
    at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:98)
    at xsbt.boot.Locks$GlobalLock.withChannelRetries$1(Locks.scala:81)
    at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:102)
    at xsbt.boot.Using$.withResource(Using.scala:11)
    at xsbt.boot.Using$.apply(Using.scala:10)
    at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:62)
    at xsbt.boot.Locks$GlobalLock.liftedTree1$1(Locks.scala:52)
    at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:52)
    at xsbt.boot.Locks$.apply0(Locks.scala:31)
    at xsbt.boot.Locks$.apply(Locks.scala:28)
    at xsbt.boot.Launch.locked(Launch.scala:165)
    at xsbt.boot.Launch.getScalaProvider(Launch.scala:167)
    at xsbt.boot.Launch$$anonfun$1.apply(Launch.scala:76)
    at org.apache.ivy.plugins.namespace.NamespaceRule.newEntry(Cache.scala:17)
    at org.apache.ivy.plugins.namespace.NamespaceRule.apply(Cache.scala:12)
    at xsbt.boot.Launch.getScala(Launch.scala:79)
    at xsbt.boot.Launch.getScala(Launch.scala:78)
    at xsbt.boot.Launch.getScala(Launch.scala:77)
    at sbt.ScalaInstance$.apply(ScalaInstance.scala:43)
    at sbt.ScalaInstance$.apply(ScalaInstance.scala:34)
    at sbt.Defaults$$anonfun$scalaInstanceSetting$1.apply(Defaults.scala:272)
    at sbt.Defaults$$anonfun$scalaInstanceSetting$1.apply(Defaults.scala:269)
    at sbt.Scoped$$anonfun$hf4$1.apply(Structure.scala:580)
    at sbt.Scoped$$anonfun$hf4$1.apply(Structure.scala:580)
    at scala.Function1$$anonfun$compose$1.apply(Function1.scala:49)
    at sbt.Scoped$Reduced$$anonfun$combine$1$$anonfun$apply$12.apply(Structure.scala:311)
    at sbt.Scoped$Reduced$$anonfun$combine$1$$anonfun$apply$12.apply(Structure.scala:311)
    at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:41)
    at sbt.std.Transform$$anon$5.work(System.scala:71)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:232)
    at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:232)
    at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
    at sbt.Execute.work(Execute.scala:238)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:232)
    at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:232)
    at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
    at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
    at java.lang.Thread.run(Thread.java:662)
[error] (root/*:scala-instance) xsbti.RetrieveException: Could not retrieve Scala 2.10
[error] Total time: 9 s, completed Feb 19, 2014 5:12:37 PM

It seems Scala 2.10 was not available, but Spark 0.9 comes with Scala 2.10 included, doesn't it? How do I solve this problem, and what is the correct way to compile a Spark application written in Scala/Java?
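For completeness, my SimpleApp.scala is essentially the example from the quick start, roughly the following (reproduced from memory, with the tutorial's placeholder paths filled in with my Spark home directory):

    /* SimpleApp.scala */
    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._

    object SimpleApp {
      def main(args: Array[String]) {
        // Any local text file works here; the tutorial uses Spark's README
        val logFile = "/home/soft/spark-0.9.0-incubating-bin-hadoop1/README.md"
        val sc = new SparkContext("local", "Simple App",
          "/home/soft/spark-0.9.0-incubating-bin-hadoop1",
          List("target/scala-2.10/simple-project_2.10-1.0.jar"))
        val logData = sc.textFile(logFile, 2).cache()
        // Count lines containing "a" and "b", as in the tutorial
        val numAs = logData.filter(line => line.contains("a")).count()
        val numBs = logData.filter(line => line.contains("b")).count()
        println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
      }
    }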