Hello,

I'm facing the following problem when trying to compile Spark 0.8.0 (sbt/sbt 
assembly) on Solaris.

[info] Compiling 247 Scala sources and 11 Java sources to /export/home/mnikolic/spark-0.8.0-incubating/core/target/scala-2.9.3/classes...
...
[error] /export/home/mnikolic/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkContext.scala:58: StandaloneSchedulerBackend is not a member of org.apache.spark.scheduler.cluster
[error] import org.apache.spark.scheduler.cluster.{StandaloneSchedulerBackend, SparkDeploySchedulerBackend,
[error]        ^
[error] /export/home/mnikolic/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkContext.scala:61: object mesos is not a member of package org.apache.spark.scheduler
[error] import org.apache.spark.scheduler.mesos.{CoarseMesosSchedulerBackend, MesosSchedulerBackend}
[error]                                   ^

Both sbt and Maven fail at the same point. I tried compiling with Java 1.6 and 1.7, and also with the newest version of sbt; the build failed in all cases.


Thanks in advance,
Milos
