The return type should be RDD[(Int, Int, Int)]: sc.textFile() returns an
RDD[String], and mapping over it gives an RDD of tuples, so the function
returns the whole distributed collection, not a single tuple. Declare the
return type as RDD[(Int, Int, Int)] in both the trait and the class, and
add an import for the RDD type to get rid of the "no type called RDD"
compile error.

import org.apache.spark.rdd.RDD
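
A minimal sketch of the corrected trait and class (assuming Spark 0.9
imports; INP_FILE is whatever path you already use, and I've added the
missing `val` on the sc assignment):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

trait VectorSim {
  // The declared type is the RDD of tuples, not a single tuple
  def input(s: String): RDD[(Int, Int, Int)]
}

class Sim extends VectorSim {
  val INP_FILE = "..."  // assumed defined somewhere in your code

  override def input(master: String): RDD[(Int, Int, Int)] = {
    // 'val' added; the original snippet assigned to an undeclared 'sc'
    val sc = new SparkContext(master, "Test")
    val ratings = sc.textFile(INP_FILE)
      .map { line =>
        val fields = line.split("\t")
        (fields(0).toInt, fields(1).toInt, fields(2).toInt)
      }
    ratings
  }
}
```

If you really want a single (Int, Int, Int) back on the driver, you'd have
to collect or reduce the RDD first (e.g. ratings.first()), but then the
trait's declared type should be (Int, Int, Int) in both places.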


On Mon, Apr 28, 2014 at 6:22 PM, SK <skrishna...@gmail.com> wrote:

> Hi,
>
> I am a new user of Spark. I have a class that defines a function as
> follows.
> It returns a tuple : (Int, Int, Int).
>
> class Sim extends VectorSim {
>      override def  input(master:String): (Int,Int,Int) = {
>             sc = new SparkContext(master, "Test")
>             val ratings = sc.textFile(INP_FILE)
>                       .map(line=> {
>                         val fields = line.split("\t")
>                         (fields(0).toInt, fields(1).toInt, fields(2).toInt)
>                       })
>             ratings
>       }
> }
>
> The class extends the trait VectorSim, where the function input() is
> declared as follows.
>
> trait VectorSim {
>   def input (s:String): (Int, Int, Int)
> }
>
> However, when I compile, I get a type mismatch saying input() returns
> RDD[(Int,Int,Int)]. So I changed the return type to RDD[(Int,Int,Int)], but
> the compiler complains that there is no type called RDD. What is the right
> way to declare the return type of a function that returns (Int,Int,Int)
> tuples?
>
> I am using spark 0.9.
>
> thanks
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-declare-Tuple-return-type-for-a-function-tp4999.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
