It looks like you need to define a constructor for ExtendedLR that accepts a
String (the uid). CrossValidator copies the estimator for each parameter
combination via Params.defaultCopy, which looks up that single-String
constructor reflectively — hence the NoSuchMethodException.
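
A minimal sketch of the two constructors you'd need — this assumes ExtendedLR
extends LogisticRegression directly; adjust the parent class and the uid prefix
to match your actual code:

```scala
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.util.Identifiable

class ExtendedLR(override val uid: String) extends LogisticRegression(uid) {

  // No-arg constructor for user code. Params.defaultCopy only looks up
  // the String (uid) constructor above, so both must exist.
  def this() = this(Identifiable.randomUID("extendedLR"))

  // ... your extensions ...
}
```

Alternatively, you can override copy(extra: ParamMap) in ExtendedLR to
construct the copy yourself instead of relying on defaultCopy.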

Cheers

On Tue, Dec 22, 2015 at 1:04 PM, njoshi <nikhil.jo...@teamaol.com> wrote:

> Hi,
>
> I have a custom extended LogisticRegression model which I want to test
> against a parameter grid search. I am running as follows:
>
>    val exLR = new ExtendedLR()
>       .setMaxIter(100)
>       .setFitIntercept(true)
>
>     /*
>      * Cross Validator parameter grid
>      */
>     val paramGrid = new ParamGridBuilder()
>       .addGrid(exLR.regParam, Array(1e-7, 1e-6, 1e-5, 1e-4, 1e-3, 2e-3, 1e-2, 1e-1, 0.001341682))
>       .addGrid(exLR.elasticNetParam, Array(0.95))
>       .build()
>
>
>     /*
>      * Perform cross validation over the parameters
>      */
>     val cv = new CrossValidator()
>       .setEstimator(exLR)
>       .setEvaluator(new BinaryClassificationEvaluator)
>       .setEstimatorParamMaps(paramGrid)
>       .setNumFolds(10)
>
>     /*
>      * Run the grid search and pick up the best model
>      */
>     val bestModel = cv.fit(trainingData)
>      .bestModel.asInstanceOf[ExtendedLRModel]
>
> While it works fine when run individually (via exLR.fit(trainingData)), the
> cross-validation code produces the following error:
>
> java.lang.NoSuchMethodException:
> org.apache.spark.ml.classification.ExtendedLR.<init>(java.lang.String)
>         at java.lang.Class.getConstructor0(Class.java:3082)
>         at java.lang.Class.getConstructor(Class.java:1825)
>         at org.apache.spark.ml.param.Params$class.defaultCopy(params.scala:529)
>         at org.apache.spark.ml.PipelineStage.defaultCopy(Pipeline.scala:37)
>         at org.apache.spark.ml.classification.ExtendedLR.copy(FactorizationMachine.scala:434)
>         at org.apache.spark.ml.classification.ExtendedLR.copy(FactorizationMachine.scala:156)
>         at org.apache.spark.ml.Estimator.fit(Estimator.scala:59)
>         at org.apache.spark.ml.Estimator$$anonfun$fit$1.apply(Estimator.scala:78)
>         at org.apache.spark.ml.Estimator$$anonfun$fit$1.apply(Estimator.scala:78)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
>         at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>         at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
>         at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
>         at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
>         at org.apache.spark.ml.Estimator.fit(Estimator.scala:78)
>         at org.apache.spark.ml.tuning.CrossValidator$$anonfun$fit$1.apply(CrossValidator.scala:89)
>         at org.apache.spark.ml.tuning.CrossValidator$$anonfun$fit$1.apply(CrossValidator.scala:84)
>         at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
>         at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
>         at org.apache.spark.ml.tuning.CrossValidator.fit(CrossValidator.scala:84)
>         at com.aol.advertising.ml.Driver$.main(Driver.scala:244)
>         at com.aol.advertising.ml.Driver.main(Driver.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:497)
>
> Is there anything, such as implicits, that I need to add somewhere?
> Note, *ExtendedLR* has the exact same inheritance tree.
>
> Thanks in advance,
> Nikhil
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Classification-model-init-method-not-found-tp25770.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
