I have seen this with sbt sometimes. Running an sbt clean usually fixes it.
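Something like the following from the sbt shell, if memory serves: clean throws away the stale incremental-compilation state, then the next compile rebuilds from scratch (compile as the target is just a guess, substitute whatever target you were running):

    > clean
    > compile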
Thanks,
Hari

On Tue, Nov 4, 2014 at 3:13 PM, Nicholas Chammas
<nicholas.cham...@gmail.com> wrote:

> FWIW, the "official" build instructions are here:
> https://github.com/apache/spark#building-spark
>
> On Tue, Nov 4, 2014 at 5:11 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> I built based on this commit today and the build was successful.
>>
>> What command did you use?
>>
>> Cheers
>>
>> On Tue, Nov 4, 2014 at 2:08 PM, Alessandro Baretta
>> <alexbare...@gmail.com> wrote:
>>
>> > Fellow Sparkers,
>> >
>> > I am new here and still trying to learn to crawl. Please bear with me.
>> >
>> > I just pulled f90ad5d from https://github.com/apache/spark.git and am
>> > running the compile command in the sbt shell. This is the error I'm
>> > seeing:
>> >
>> > [error]
>> > /home/alex/git/spark/mllib/src/main/scala/org/apache/spark/mllib/linalg/Vectors.scala:32:
>> > object sql is not a member of package org.apache.spark
>> > [error] import org.apache.spark.sql.catalyst.types._
>> > [error]        ^
>> >
>> > Am I doing something obscenely stupid, or is the build genuinely broken?
>> >
>> > Alex
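For what it's worth, a full clean rebuild from the top of the checkout, per the README linked above, would look roughly like this (sbt/sbt is the launcher script bundled in the repo; assembly as the target is an assumption about the end goal):

    cd ~/git/spark
    ./sbt/sbt clean assembly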