[
https://issues.apache.org/jira/browse/SPARK-8400?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14625483#comment-14625483
]
Bryan Cutler commented on SPARK-8400:
-------------------------------------
Hi [~mengxr], just in case you missed my comment in the PR about adding an
error message in branch 1.3 for this:
{quote}
{noformat}
using branch-1.3 I gave ALS a -1 block size and got the following exception:
[info] ALSSuite:
[info] - more blocks than ratings *** FAILED *** (1 second, 112 milliseconds)
[info] java.lang.IllegalArgumentException: requirement failed: numBlocks must
be positive but found -1.
[info] at scala.Predef$.require(Predef.scala:233)
[info] at
org.apache.spark.ml.recommendation.ALS$LocalIndexEncoder.<init>(ALS.scala:1164)
Seems like it's already fixed to me, what do you think?
{noformat}
{quote}
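To illustrate the behavior in question, here is a minimal sketch of how a -1 block count could be resolved the way spark.mllib does (auto-pick from the input partitioning) while rejecting other non-positive values. The helper name {{resolveNumBlocks}} and the {{numPartitions / 2}} heuristic are hypothetical, for illustration only; they are not the actual mllib implementation.

```scala
// Hypothetical helper: treat -1 as "choose the number of blocks
// automatically from the input partitioning", reject 0 and other
// negative values with a clear error message.
def resolveNumBlocks(requested: Int, numPartitions: Int): Int = {
  require(requested == -1 || requested > 0,
    s"numBlocks must be -1 (auto) or positive but found $requested.")
  if (requested == -1) math.max(numPartitions / 2, 1) else requested
}
```

With this shape, -1 silently works as in spark.mllib, while the require gives the explicit error message discussed above for invalid values like 0 or -2.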
> ml.ALS doesn't handle -1 block size
> -----------------------------------
>
> Key: SPARK-8400
> URL: https://issues.apache.org/jira/browse/SPARK-8400
> Project: Spark
> Issue Type: Bug
> Components: ML
> Affects Versions: 1.3.1
> Reporter: Xiangrui Meng
> Assignee: Bryan Cutler
>
> Under spark.mllib, if the number of blocks is set to -1, we set the block
> size automatically based on the input partition size. However, this behavior
> is not preserved in the spark.ml API. If a user sets -1 in Spark 1.3, it will
> not work, and no error message is shown.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)