Hi Swaroop,
from my understanding, Isotonic Regression is currently limited to data
with one feature plus a weight and a label. Also, the entire data set is
required to fit into the memory of a single machine.
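To illustrate why the algorithm is inherently one-dimensional: isotonic
regression fits a monotone function of a single ordering variable, typically
via the pool-adjacent-violators algorithm (PAVA). Below is a minimal
single-machine sketch, assuming the points are already sorted by their one
feature; `isotonicFit` is a hypothetical helper for illustration, not part of
the MLlib API.

```scala
// Minimal PAVA sketch: given labels (sorted by their single feature) and
// per-point weights, produce the best non-decreasing fit.
def isotonicFit(labels: Array[Double], weights: Array[Double]): Array[Double] = {
  // each block holds (weighted mean, total weight, number of points merged)
  val blocks = scala.collection.mutable.ArrayBuffer.empty[(Double, Double, Int)]
  for (i <- labels.indices) {
    var mean = labels(i); var w = weights(i); var n = 1
    // merge backwards while the previous block violates monotonicity
    while (blocks.nonEmpty && blocks.last._1 > mean) {
      val (pm, pw, pn) = blocks.remove(blocks.length - 1)
      mean = (pm * pw + mean * w) / (pw + w); w += pw; n += pn
    }
    blocks += ((mean, w, n))
  }
  // expand each block back to one fitted value per input point
  blocks.toArray.flatMap { case (m, _, n) => Array.fill(n)(m) }
}
```

For example, isotonicFit(Array(1.0, 3.0, 2.0, 4.0), Array(1.0, 1.0, 1.0, 1.0))
pools the violating pair (3.0, 2.0) into 2.5. With several features there is no
single ordering to pool along, which is why the API only takes one.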
I did some work on the latter issue but discontinued the project
because I felt no one really needed it. I'd be happy to resume my work
on Spark's IR implementation, but I fear there won't be a quick fix for
your issue.
Fridtjof
On 08.07.2016 at 22:38, dsp wrote:
Hi, I am trying to perform Isotonic Regression on a data set with 9
features and a label.
When I run the algorithm the way shown on the MLlib page, I get this
error:
error: overloaded method value run with alternatives:
  (input: org.apache.spark.api.java.JavaRDD[(java.lang.Double, java.lang.Double,
  java.lang.Double)])org.apache.spark.mllib.regression.IsotonicRegressionModel
  <and>
  (input: org.apache.spark.rdd.RDD[(scala.Double, scala.Double,
  scala.Double)])org.apache.spark.mllib.regression.IsotonicRegressionModel
cannot be applied to (org.apache.spark.rdd.RDD[(scala.Double, scala.Double,
scala.Double, scala.Double, scala.Double, scala.Double, scala.Double,
scala.Double, scala.Double, scala.Double, scala.Double, scala.Double,
scala.Double)])

val model = new IsotonicRegression().setIsotonic(true).run(training)
From the way it is given in the sample code, it looks like this can only
be done for a data set with a single feature, because the run() method
accepts only three-element tuples: one slot is the label and one is the
weight, leaving room for only one feature.
So, how can this be done for multiple features?
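For reference, run() expects (label, feature, weight) triples, so a
multi-feature row has to be projected down to one feature before training.
The sketch below shows that shape on plain Scala collections (in Spark the
same .map would be applied to an RDD); picking feats(0) as the ordering
variable and using unit weights are illustrative assumptions, and note this
orders by a single feature only; it is not true multi-feature isotonic
regression.

```scala
// rows: (label, all 9 features) - here just 3 features to keep it short
val rows: Seq[(Double, Array[Double])] = Seq(
  (1.0, Array(0.1, 0.2, 0.3)),
  (2.0, Array(0.4, 0.5, 0.6))
)

// project each row to the (label, feature, weight) triple that run() accepts,
// choosing the first feature and a unit weight (both assumptions)
val training: Seq[(Double, Double, Double)] =
  rows.map { case (label, feats) => (label, feats(0), 1.0) }
```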
Regards,
Swaroop
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Isotonic-Regression-run-method-overloaded-Error-tp27313.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org