[ https://issues.apache.org/jira/browse/SPARK-1945?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14047018#comment-14047018 ]

Michael Yannakopoulos commented on SPARK-1945:
----------------------------------------------

Hi guys,

I am having difficulty translating the code snippet provided for SVMWithSGD
[http://spark.apache.org/docs/latest/mllib-linear-methods.html#linear-support-vector-machine-svm]
into Java.
More specifically, I cannot figure out what parameter type is accepted by
the _constructor_ of *BinaryClassificationMetrics*.
From the ScalaDoc
[http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.mllib.evaluation.BinaryClassificationMetrics],
I see that the constructor needs to take an argument of type RDD[(Double,
Double)], which I suspect translates to
JavaRDD<Tuple2<Double, Double>>, JavaRDD<Product2<Double, Double>>, or
JavaRDD<Product<Double, Double>>.
However, none of these seems to work. Any suggestions?
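
For reference, here is a minimal sketch of what I am trying to get working, assuming a
trained SVMModel named {{model}} and a JavaRDD<LabeledPoint> named {{test}}, mirroring the
Scala example on the linked page. My guess is that the Scala (Double, Double) tuples surface
in Java as Tuple2<Object, Object> because of erasure, and that the JavaRDD then has to be
converted with JavaRDD.toRDD(...) (or .rdd()) before it is passed to the constructor:

{code:java}
import scala.Tuple2;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.mllib.classification.SVMModel;
import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics;
import org.apache.spark.mllib.regression.LabeledPoint;

// Assumes `model` (a trained SVMModel) and `test` (a JavaRDD<LabeledPoint>)
// are final locals in a static main, as in the Scala snippet being translated.
JavaRDD<Tuple2<Object, Object>> scoreAndLabels = test.map(
  new Function<LabeledPoint, Tuple2<Object, Object>>() {
    public Tuple2<Object, Object> call(LabeledPoint point) {
      double score = model.predict(point.features());
      return new Tuple2<Object, Object>(score, point.label());
    }
  });

// The Scala constructor takes RDD[(Double, Double)]; seen from Java that
// erases to RDD<Tuple2<Object, Object>>, so the JavaRDD is converted first.
BinaryClassificationMetrics metrics =
  new BinaryClassificationMetrics(JavaRDD.toRDD(scoreAndLabels));
double auROC = metrics.areaUnderROC();
{code}

If Tuple2<Object, Object> really is the intended Java-side mapping for the Scala primitive
Double, it would be great to have that spelled out in a Java tab on that page.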

> Add full Java examples in MLlib docs
> ------------------------------------
>
>                 Key: SPARK-1945
>                 URL: https://issues.apache.org/jira/browse/SPARK-1945
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Documentation, MLlib
>            Reporter: Matei Zaharia
>              Labels: Starter
>             Fix For: 1.0.0
>
>
> Right now some of the Java tabs only say the following:
> "All of MLlib’s methods use Java-friendly types, so you can import and call 
> them there the same way you do in Scala. The only caveat is that the methods 
> take Scala RDD objects, while the Spark Java API uses a separate JavaRDD 
> class. You can convert a Java RDD to a Scala one by calling .rdd() on your 
> JavaRDD object."
> Would be nice to translate the Scala code into Java instead.
> Also, a few pages (most notably the Matrix one) don't have Java examples at 
> all.



