Some more notes on how to debug this: after you run publish-local in
Spark, you should have a jar under ~/.ivy2 that you can check for with
`ls
~/.ivy2/local/org.apache.spark/spark-core_2.9.3/0.8.0-SNAPSHOT/jars/spark-core_2.9.3.jar`

Alternatively, `sbt/sbt publish-local` prints something like this on the console:

 [info]  published spark-core_2.9.3 to
/home/shivaram/.ivy2/local/org.apache.spark/spark-core_2.9.3/0.8.0-SNAPSHOT/jars/spark-core_2.9.3.jar

After that, MLI's build should be able to pick up this jar.
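
(sbt resolves against the local Ivy repository at ~/.ivy2/local by default, so MLI's
build.sbt should not need any resolver changes for this.) To recap, the full sequence
would look roughly like the following (paths assume the spark and MLI checkouts sit
side by side under ~/git as you described, and that MLI is also built with its
sbt/sbt launcher; adjust to your layout):

  # in the Spark checkout
  cd ~/git/spark
  sbt/sbt clean publish-local

  # verify the jar landed in the local Ivy repository
  ls ~/.ivy2/local/org.apache.spark/spark-core_2.9.3/0.8.0-SNAPSHOT/jars/spark-core_2.9.3.jar

  # then build MLI against the locally published snapshot
  cd ~/git/MLI
  sbt/sbt clean package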

Thanks
Shivaram




On Tue, Sep 10, 2013 at 1:14 PM, Gowtham N <[email protected]> wrote:

> I did run it as publish-local.
> I forked mesos/spark to gowthamnatarajan/spark, and I am using that. I
> forked a few days ago, but did an upstream update today.
>
> To be safe, I will clone directly from mesos now.
>
>
>
> On Tue, Sep 10, 2013 at 1:10 PM, Shivaram Venkataraman <
> [email protected]> wrote:
>
>> Did you check out spark from the master branch of github.com/mesos/spark?
>> The package names changed recently, so you might need to pull. Also, just
>> checking that you ran publish-local in Spark (not public-local, as written
>> in your email)?
>>
>> Thanks
>> Shivaram
>>
>>
>> On Tue, Sep 10, 2013 at 1:01 PM, Gowtham N <[email protected]>
>> wrote:
>>
>> > I'm still getting the same error.
>> >
>> > I have the spark and MLI folders inside a folder called git.
>> >
>> > I did clean, package, and public-local for spark.
>> > Then for MLI I did clean and then package.
>> > I am still getting the error.
>> >
>> > [warn] ::::::::::::::::::::::::::::::::::::::::::::::
>> > [warn] ::          UNRESOLVED DEPENDENCIES         ::
>> > [warn] ::::::::::::::::::::::::::::::::::::::::::::::
>> > [warn] :: org.apache.spark#spark-core_2.9.3;0.8.0-SNAPSHOT: not found
>> > [warn] :: org.apache.spark#spark-mllib_2.9.3;0.8.0-SNAPSHOT: not found
>> > [warn] ::::::::::::::::::::::::::::::::::::::::::::::
>> > [error] {file:/Users/gowthamn/git/MLI/}default-0b9403/*:update:
>> > sbt.ResolveException: unresolved dependency:
>> > org.apache.spark#spark-core_2.9.3;0.8.0-SNAPSHOT: not found
>> > [error] unresolved dependency:
>> > org.apache.spark#spark-mllib_2.9.3;0.8.0-SNAPSHOT: not found
>> >
>> > Should I modify the contents of build.sbt?
>> > Currently it's:
>> >
>> > libraryDependencies ++= Seq(
>> >   "org.apache.spark" % "spark-core_2.9.3" % "0.8.0-SNAPSHOT",
>> >   "org.apache.spark" % "spark-mllib_2.9.3" % "0.8.0-SNAPSHOT",
>> >   "org.scalatest" %% "scalatest" % "1.9.1" % "test"
>> > )
>> >
>> > resolvers ++= Seq(
>> >   "Typesafe" at "http://repo.typesafe.com/typesafe/releases";,
>> >   "Scala Tools Snapshots" at "http://scala-tools.org/repo-snapshots/";,
>> >   "ScalaNLP Maven2" at "http://repo.scalanlp.org/repo";,
>> >   "Spray" at "http://repo.spray.cc";
>> > )
>> >
>> >
>> >
>> >
>> >
>> >
>> > On Tue, Sep 10, 2013 at 11:58 AM, Evan R. Sparks <[email protected]> wrote:
>> >
>> > > Hi Gowtham,
>> > >
>> > > You'll need to do "sbt/sbt publish-local" in the spark directory
>> > > before trying to build MLI.
>> > >
>> > > - Evan
>> > >
>> > > On Tue, Sep 10, 2013 at 11:37 AM, Gowtham N <[email protected]> wrote:
>> > > > I cloned MLI, but am unable to compile it.
>> > > >
>> > > > I get the following unresolved dependency errors:
>> > > >
>> > > > org.apache.spark#spark-core_2.9.3;0.8.0-SNAPSHOT: not found
>> > > > org.apache.spark#spark-mllib_2.9.3;0.8.0-SNAPSHOT: not found
>> > > >
>> > > > Why am I getting this error?
>> > > >
>> > > > I did not change anything in build.sbt:
>> > > >
>> > > > libraryDependencies ++= Seq(
>> > > >   "org.apache.spark" % "spark-core_2.9.3" % "0.8.0-SNAPSHOT",
>> > > >   "org.apache.spark" % "spark-mllib_2.9.3" % "0.8.0-SNAPSHOT",
>> > > >   "org.scalatest" %% "scalatest" % "1.9.1" % "test"
>> > > > )
>> > >
>> >
>> >
>> >
>> > --
>> > Gowtham Natarajan
>> >
>>
>
>
>
> --
> Gowtham Natarajan
>
