[ https://issues.apache.org/jira/browse/SPARK-18230?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15630741#comment-15630741 ]
yuhao yang commented on SPARK-18230:
------------------------------------

Perhaps we can use Double.NaN in this case, just to be consistent with spark.ml.als. [~mikaelstaldal] Do you plan to send a PR for this?

> MatrixFactorizationModel.recommendProducts throws NoSuchElementException when the user does not exist
> ------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-18230
>                 URL: https://issues.apache.org/jira/browse/SPARK-18230
>             Project: Spark
>          Issue Type: Improvement
>          Components: MLlib
>    Affects Versions: 2.0.1
>            Reporter: Mikael Ståldal
>            Priority: Minor
>
> When invoking {{MatrixFactorizationModel.recommendProducts(Int, Int)}} with a non-existing user, a {{java.util.NoSuchElementException}} is thrown:
> {code}
> java.util.NoSuchElementException: next on empty iterator
> 	at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)
> 	at scala.collection.Iterator$$anon$2.next(Iterator.scala:37)
> 	at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:63)
> 	at scala.collection.IterableLike$class.head(IterableLike.scala:107)
> 	at scala.collection.mutable.WrappedArray.scala$collection$IndexedSeqOptimized$$super$head(WrappedArray.scala:35)
> 	at scala.collection.IndexedSeqOptimized$class.head(IndexedSeqOptimized.scala:126)
> 	at scala.collection.mutable.WrappedArray.head(WrappedArray.scala:35)
> 	at org.apache.spark.mllib.recommendation.MatrixFactorizationModel.recommendProducts(MatrixFactorizationModel.scala:169)
> {code}
> It would be nice if it returned an empty array, or threw a more specific exception, and if that behavior were documented in the ScalaDoc for the method.
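For illustration, a minimal sketch of the workaround the description implies: check the model's userFeatures for the user id before delegating to recommendProducts, so an unknown user yields an empty array instead of the exception. The helper name safeRecommendProducts is hypothetical, not part of MLlib.

{code}
import org.apache.spark.mllib.recommendation.{MatrixFactorizationModel, Rating}

// Hypothetical wrapper, not part of MLlib: return an empty array for an
// unknown user instead of letting recommendProducts throw
// java.util.NoSuchElementException.
def safeRecommendProducts(
    model: MatrixFactorizationModel,
    user: Int,
    num: Int): Array[Rating] = {
  // model.userFeatures is an RDD[(Int, Array[Double])]; lookup returns an
  // empty Seq when the user id was not seen during training.
  if (model.userFeatures.lookup(user).isEmpty) Array.empty[Rating]
  else model.recommendProducts(user, num)
}
{code}

Note that lookup launches a job per call; if many ids need checking, it would be cheaper to collect the known user ids once and test membership locally.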