Thanks for your answers. I added some lines to my code and it went through,
but now I get an error message from my computeCost call...

scala> val WSSSE = model.computeCost(train)
14/08/08 15:48:42 WARN BlockManagerMasterActor: Removing BlockManager BlockManagerId(<driver>, 192.168.0.33, 49242, 0) with no recent heart beats: 156207ms exceeds 45000ms
14/08/08 15:48:42 INFO BlockManager: BlockManager re-registering with master
14/08/08 15:48:42 INFO BlockManagerMaster: Trying to register BlockManager
14/08/08 15:48:42 INFO BlockManagerInfo: Registering block manager
192.168.0.33:49242 with 303.4 MB RAM
14/08/08 15:48:42 INFO BlockManagerMaster: Registered BlockManager
14/08/08 15:48:42 INFO BlockManager: Reporting 0 blocks to the master.

<console>:30: error: value computeCost is not a member of
org.apache.spark.mllib.clustering.KMeans
       val WSSSE = model.computeCost(train)

computeCost should be a member of KMeans, shouldn't it?
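
Or rather, from the MLlib docs as I read them, computeCost is defined on
KMeansModel, which is what run(...) returns, so the error suggests my model is
still a plain KMeans. A minimal sketch of the call pattern I'd expect, using
the static KMeans.train helper on the same train RDD (this is my reading of
the API, not tested code):

import org.apache.spark.mllib.clustering.{KMeans, KMeansModel}

// KMeans.train returns a KMeansModel; computeCost is a method on the
// model, not on the KMeans instance used to configure the run.
val m: KMeansModel = KMeans.train(train, 2, 2)  // k = 2, maxIterations = 2
val cost = m.computeCost(train)                 // within-set sum of squared errors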

My whole code is here:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

val conf = new SparkConf()
.setMaster("local")
.setAppName("Kmeans")
.set("spark.executor.memory", "2g")
val sc = new SparkContext(conf)



import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.mllib.clustering.KMeansModel
import org.apache.spark.mllib.linalg.Vectors

// Load and parse the data
val data = sc.textFile("data/outkmeanssm.txt")
val parsedData = data.map(s => Vectors.dense(s.split(' ').map(_.toDouble)))
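// (assumption: each line of outkmeanssm.txt holds one point as
// space-separated doubles, e.g. "0.5 1.2 3.4")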
val train = parsedData.repartition(20).cache() 

// Configure the model and run it
val model = new KMeans()
.setInitializationMode("k-means||")
.setK(2)
.setMaxIterations(2)
.setEpsilon(1e-4)
.setRuns(1)
.run(train)

// Evaluate clustering by computing Within Set Sum of Squared Errors
val WSSSE = model.computeCost(train)
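
One possible cause I'm only guessing at: I enter the code into spark-shell
line by line, and the leading-dot chain after new KMeans() may not stay
attached to the val, leaving model bound to the bare KMeans instance, which
would explain the error above. A variant that forces the whole expression into
one statement (same settings; alternatively the block can be entered via
:paste):

// Same configuration written as a single expression with trailing dots,
// so the shell cannot split the chain; model should then be a KMeansModel.
val model = new KMeans().
  setInitializationMode("k-means||").
  setK(2).
  setMaxIterations(2).
  setEpsilon(1e-4).
  setRuns(1).
  run(train)

val WSSSE = model.computeCost(train)  // compiles once model is a KMeansModel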


