Re: Any way to get raw score from MultilayerPerceptronClassificationModel ?

2015-11-17 Thread Robert Dodier
On Tue, Nov 17, 2015 at 2:36 PM, Ulanov, Alexander
 wrote:

> Raw scores are not available through the public API.
> It would be great to add this feature, it seems that we overlooked it.

OK, thanks for the info.

> The better way would be to write a new implementation of MLP
> model that will extend Classifier (instead of Predictor).

Actually I think it would be better still to extend ProbabilisticClassifier,
since MLP classifier outputs are posterior class probabilities.
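To make the distinction concrete, here is a self-contained sketch (not Spark API; the object name and activation values are made up for illustration) of what a ProbabilisticClassifier-style MLP would expose: the softmax of the output-layer activations is a vector of posterior class probabilities, of which predict() keeps only the argmax.

```scala
// Sketch only (not Spark code): softmax turns output-layer activations
// into posterior class probabilities; predict() keeps only the index of
// the largest one, discarding the rest of the vector.
object RawScoreSketch {
  def softmax(z: Array[Double]): Array[Double] = {
    val m = z.max                          // shift by max for numerical stability
    val e = z.map(v => math.exp(v - m))
    val s = e.sum
    e.map(_ / s)
  }

  def main(args: Array[String]): Unit = {
    val activations = Array(1.0, 3.0, 0.5)     // hypothetical output-layer values
    val raw = softmax(activations)             // the "raw score" a ProbabilisticClassifier could expose
    val label = raw.indices.maxBy(i => raw(i)) // all that predict() returns today
    println(raw.mkString(" "))
    println(s"argmax = $label")
  }
}
```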

best,

Robert Dodier

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Any way to get raw score from MultilayerPerceptronClassificationModel ?

2015-11-17 Thread Robert Dodier
Hi,

I'd like to get the raw prediction score from a
MultilayerPerceptronClassificationModel. It appears that the 'predict'
method only returns the index of the largest score in the output
layer (line 200 in MultilayerPerceptronClassificationModel.scala in
Spark 1.5.2).

Is there any way to get the raw score? It is computed as
mlpModel.predict(features) in the source code, but mlpModel is private
in MultilayerPerceptronClassificationModel, so I can't just grab it
and call predict on it. Is there another way?
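One possible stopgap, untested against Spark itself: read the private member through Java reflection. The snippet below demonstrates the technique on a stand-in class, since the point is the mechanism rather than Spark's internals; applying it to the real model would mean calling readPrivate with the field name "mlpModel", assuming the private member is in fact a field with that name.

```scala
// Reflection workaround sketch, shown on a stand-in class.
object PrivateFieldSketch {
  // Stand-in for a model that hides its scorer behind a private val.
  class Model { private val inner: String = "raw-scorer" }

  // Read a private field by name, bypassing access checks.
  def readPrivate(obj: AnyRef, name: String): AnyRef = {
    val f = obj.getClass.getDeclaredField(name)
    f.setAccessible(true)
    f.get(obj)
  }

  def main(args: Array[String]): Unit = {
    println(readPrivate(new Model, "inner"))
  }
}
```

Reflection like this is brittle, since it depends on an implementation detail that can change between Spark versions, so at best it is a bridge until raw scores are part of the public API.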

Thanks for any light you can shed on this question.

Robert Dodier

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



spark-shell :javap fails with complaint about JAVA_HOME, but it is set correctly

2015-10-14 Thread Robert Dodier
Hi,

I am working with Spark 1.5.1 (official release), with Oracle Java 8,
on Ubuntu 14.04. echo $JAVA_HOME says "/usr/lib/jvm/java-8-oracle".

I'd like to use :javap in spark-shell, but I get an error message:

scala> :javap java.lang.Object
Failed: Could not load javap tool. Check that JAVA_HOME is correct.

However, ls $JAVA_HOME/lib/tools.jar shows that it is there.

I tried starting spark-shell with -toolcp $JAVA_HOME/lib/tools.jar,
but I get the same error.

For comparison, if I execute scala and enter :javap java.lang.Object,
it works as expected.
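For anyone hitting the same thing, a sketch of the checks involved (the JDK path is the one from this message, so adjust for your install; the -toolcp line is the attempt described above, not a confirmed fix):

```shell
# Path layout from the message; adjust JAVA_HOME for your install.
JAVA_HOME=/usr/lib/jvm/java-8-oracle
TOOLS_JAR="$JAVA_HOME/lib/tools.jar"   # :javap needs this JDK jar
echo "$TOOLS_JAR"
# Then, interactively, one would verify the jar and retry:
#   ls "$TOOLS_JAR"
#   spark-shell -toolcp "$TOOLS_JAR"
```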

Not sure where to go from here. Thanks for any advice.

best,

Robert Dodier

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org