Hi,

I am a new user of Spark. I have a class that defines a function as follows.
It is declared to return a tuple of type (Int, Int, Int).

class Sim extends VectorSim {
  override def input(master: String): (Int, Int, Int) = {
    sc = new SparkContext(master, "Test")
    val ratings = sc.textFile(INP_FILE)
      .map(line => {
        val fields = line.split("\t")
        (fields(0).toInt, fields(1).toInt, fields(2).toInt)
      })
    ratings
  }
}

The class extends the trait VectorSim, where input() is declared as
follows.

trait VectorSim {
  def input(s: String): (Int, Int, Int)
}

However, when I compile, I get a type mismatch saying that input() actually
returns RDD[(Int, Int, Int)]. So I changed the declared return type to
RDD[(Int, Int, Int)], but then the compiler complains that there is no type
called RDD. What is the right way to declare the return type for a function
that produces these (Int, Int, Int) tuples?
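
For reference, here is roughly what my changed version looks like. I am
guessing that RDD needs to be imported from org.apache.spark.rdd for the new
return type to compile, but I am not sure that is the right fix. The INP_FILE
value and the sc field below are placeholders; in my real code they are
defined elsewhere.

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

trait VectorSim {
  // the result is built with textFile(...).map(...), so it is an RDD of tuples
  def input(s: String): RDD[(Int, Int, Int)]
}

class Sim extends VectorSim {
  val INP_FILE = "ratings.tsv"     // placeholder path; mine is set elsewhere
  var sc: SparkContext = _         // placeholder; declared elsewhere in my code

  override def input(master: String): RDD[(Int, Int, Int)] = {
    sc = new SparkContext(master, "Test")
    val ratings = sc.textFile(INP_FILE)
      .map(line => {
        val fields = line.split("\t")
        (fields(0).toInt, fields(1).toInt, fields(2).toInt)
      })
    ratings
  }
}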

I am using Spark 0.9.

Thanks



