You need to add the "spark-graphx" artifact to your libraryDependencies.
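
For example, something like this should work (using 0.9.0-incubating to
match the spark-core version in your sbt file below):

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "0.9.0-incubating"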

We should add a note to the graphx and mllib pages to include linking
instructions (like we have for streaming:
http://spark.incubator.apache.org/docs/latest/streaming-programming-guide.html#linking
)


On Mon, Feb 17, 2014 at 9:02 AM, xben <[email protected]> wrote:

> Hello,
>
> I'm trying to build a very simple Scala standalone app using the graphx
> library. I basically copy/pasted the triangle count example and wrote the
> sbt file, but I get the following error when trying to build the program:
>
> [error]
>
> /data/home/benjamin/spark/spark-0.9.0-incubating-bin-hadoop2/ben_new/src/main/scala/testGraph.scala:2:
> object graphx is not a member of package org.apache.spark
> [error] import org.apache.spark.graphx._
> [error]                         ^
> [error]
>
> /data/home/benjamin/spark/spark-0.9.0-incubating-bin-hadoop2/ben_new/src/main/scala/testGraph.scala:16:
> not found: value GraphLoader
> [error]     val graph = GraphLoader.edgeListFile(sc, LINKS,
> true).partitionBy(PartitionStrategy.RandomVertexCut)
> [error]                 ^
> [error] two errors found
> [error] (compile:compile) Compilation failed
> [error] Total time: 13 s, completed Feb 17, 2014 4:47:37 PM
>
> Here is the sbt file I'm using:
>
> name := "Graph Test"
>
> version := "1.0"
>
> scalaVersion := "2.10.0"
>
> libraryDependencies += "org.apache.spark" %% "spark-core" %
> "0.9.0-incubating"
>
> libraryDependencies += "org.apache.hadoop" % "hadoop-client" %
> "2.0.0-cdh4.4.0"
>
> resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
>
> Any idea on what's wrong here?
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Building-a-Standalone-App-in-Scala-and-graphX-tp1622.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
