Hi, I am trying to compile Spark 1.1.0 on Windows 8.1, but I get the following errors.
[info] Compiling 3 Scala sources to D:\myworkplace\software\spark-1.1.0\project\target\scala-2.10\sbt0.13\classes...
[error] D:\myworkplace\software\spark-1.1.0\project\SparkBuild.scala:26: object sbt is not a member of package com.typesafe
[error] import com.typesafe.sbt.pom.{PomBuild, SbtPomKeys}
[error]                     ^
[error] D:\myworkplace\software\spark-1.1.0\project\SparkBuild.scala:53: not found: type PomBuild
[error] object SparkBuild extends PomBuild {
[error]                           ^
[error] D:\myworkplace\software\spark-1.1.0\project\SparkBuild.scala:121: not found: value SbtPomKeys
[error]   otherResolvers <<= SbtPomKeys.mvnLocalRepository(dotM2 => Seq(Resolver.file("dotM2", dotM2))),
[error]                      ^
[error] D:\myworkplace\software\spark-1.1.0\project\SparkBuild.scala:165: value projectDefinitions is not a member of AnyRef
[error]     super.projectDefinitions(baseDirectory).map { x =>
[error]           ^
[error] four errors found
[error] (plugins/compile:compile) Compilation failed

I have also set up Scala 2.10. Any help resolving this issue would be appreciated.

Regards,
Ishwardeep

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-compile-spark-1-1-0-on-windows-8-1-tp19996.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
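P.S. For reference, these are roughly the build invocations documented for Spark 1.1.0; the Hadoop profile below is only an example, and exact flags may differ for your setup:

    # Run from the root of the Spark 1.1.0 source tree.

    # sbt build (the path failing above). The "object sbt is not a member of
    # package com.typesafe" errors suggest sbt could not resolve the
    # sbt-pom-reader plugin that SparkBuild.scala depends on, e.g. because
    # the plugin download failed or the project/ directory is stale.
    sbt/sbt assembly

    # Maven build, which avoids that sbt plugin entirely.
    # -Phadoop-2.4 is illustrative; pick the profile matching your Hadoop.
    mvn -Phadoop-2.4 -DskipTests clean package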