Thanks Xiangrui, Matei and Arpit. It does work fine after switching to
Vectors.dense. I have a follow-up question, which I will post in a new thread.
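For anyone else hitting this, here is a minimal sketch of the parsing that
compiles for me after the change, assuming the post-0.9.1 MLlib API where
LabeledPoint takes an org.apache.spark.mllib.linalg.Vector for its features
(the "local" master is only for illustration; the data path is the one from
the snippet quoted below):

----code----

import org.apache.spark.SparkContext
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint

val sc = new SparkContext("local", "SparkLR")

// Each line looks like "label,f1 f2 f3 ...". Wrapping the feature array in
// Vectors.dense gives LabeledPoint the Vector it now expects instead of an
// Array[Double].
val data = sc.textFile("mllib/data/ridge-data/lpsa.data")
val parsedData = data.map { line =>
  val parts = line.split(',')
  LabeledPoint(parts(0).toDouble, Vectors.dense(parts(1).split(' ').map(_.toDouble)))
}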


On Thu, Apr 24, 2014 at 2:49 AM, Arpit Tak <arpi...@sigmoidanalytics.com>wrote:

> Also try out these examples; all of them work:
>
> http://docs.sigmoidanalytics.com/index.php/MLlib
>
>
> Regards,
> arpit
>
>
> On Wed, Apr 23, 2014 at 11:08 PM, Matei Zaharia 
> <matei.zaha...@gmail.com>wrote:
>
>> See http://people.csail.mit.edu/matei/spark-unified-docs/ for a more
>> recent build of the docs; if you spot any problems in those, let us know.
>>
>> Matei
>>
>> On Apr 23, 2014, at 9:49 AM, Xiangrui Meng <men...@gmail.com> wrote:
>>
>> > The doc is for 0.9.1. You are running a later snapshot, which added
>> > sparse vectors. Try LabeledPoint(parts(0).toDouble,
>> > Vectors.dense(parts(1).split(' ').map(x => x.toDouble))). The examples
>> > are updated in the master branch. You can also check the examples
>> > there. -Xiangrui
>> >
>> > On Wed, Apr 23, 2014 at 9:34 AM, Mohit Jaggi <mohitja...@gmail.com>
>> wrote:
>> >>
>> >> sorry...added a subject now
>> >>
>> >> On Wed, Apr 23, 2014 at 9:32 AM, Mohit Jaggi <mohitja...@gmail.com>
>> wrote:
>> >>>
>> >>> I am trying to run the example linear regression code from
>> >>>
>> >>> http://spark.apache.org/docs/latest/mllib-guide.html
>> >>>
>> >>> But I am getting the following error...am I missing an import?
>> >>>
>> >>> ----code----
>> >>>
>> >>> import org.apache.spark._
>> >>>
>> >>> import org.apache.spark.mllib.regression.LinearRegressionWithSGD
>> >>>
>> >>> import org.apache.spark.mllib.regression.LabeledPoint
>> >>>
>> >>>
>> >>> object ModelLR {
>> >>>
>> >>>   def main(args: Array[String]) {
>> >>>
>> >>>     val sc = new SparkContext(args(0), "SparkLR",
>> >>>       System.getenv("SPARK_HOME"), SparkContext.jarOfClass(this.getClass).toSeq)
>> >>>
>> >>>     // Load and parse the data
>> >>>
>> >>>     val data = sc.textFile("mllib/data/ridge-data/lpsa.data")
>> >>>
>> >>>     val parsedData = data.map { line =>
>> >>>
>> >>>       val parts = line.split(',')
>> >>>
>> >>>       LabeledPoint(parts(0).toDouble, parts(1).split(' ').map(x => x.toDouble).toArray)
>> >>>
>> >>>     }
>> >>>
>> >>> ...<snip>...
>> >>>
>> >>> }
>> >>>
>> >>> ----error----
>> >>>
>> >>> - polymorphic expression cannot be instantiated to expected type;
>> >>>    found   : [U >: Double]Array[U]
>> >>>    required: org.apache.spark.mllib.linalg.Vector
>> >>>
>> >>> - polymorphic expression cannot be instantiated to expected type;
>> >>>    found   : [U >: Double]Array[U]
>> >>>    required: org.apache.spark.mllib.linalg.Vector
>> >>
>> >>
>>
>>
>
