I don't think this is Spark-specific. The modified definition doesn't
compile, even if java.util.Vector was intended.

Why not just

def parseVector(line: String) = line.split(',').map(_.toFloat)

It returns an Array[Float], which may be entirely fine wherever you
wanted a Scala Vector.
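For concreteness, here is a minimal, self-contained sketch of that suggestion. The sample input line is made up for illustration:

```scala
// Sketch: parse a comma-separated line of numbers into an Array[Float],
// instead of wrapping doubles in a Vector as in the original example.
def parseVector(line: String): Array[Float] =
  line.split(',').map(_.toFloat)

val v = parseVector("1.0,2.5,3.75")   // hypothetical input line
println(v.mkString(","))              // prints 1.0,2.5,3.75
```

Because Array[Float] already supports the usual map/zip/sum operations, no Vector wrapper is needed for simple arithmetic on the parsed values.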
--
Sean Owen | Director, Data Science | London


On Mon, Feb 17, 2014 at 7:58 PM, agg <[email protected]> wrote:
> Hi,
>
> I would like to run the spark example with floats instead of doubles.  When
> I change this:
>
>   def parseVector(line: String): Vector = {
>     return new Vector(line.split(',').map(_.toDouble))
>   }
>
> to:
>
>   def parseVector(line: String): Vector = {
>     return new Vector(line.split(',').map(_.toFloat))
>   }
>
> I get an error, saying it is expecting a double.  Any thoughts?
>
> Thanks!
>
>
>
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Kmeans-example-with-floats-tp1640.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
