Ah cool, thanks for the link!

On 6 December 2016 at 12:25, Nick Pentreath <nick.pentre...@gmail.com>
wrote:

> Indeed, it's being tracked here: https://issues.apache.org/jira/browse/SPARK-18230, though no PR has been opened yet.
>
>
> On Tue, 6 Dec 2016 at 13:36 chris snow <chsnow...@gmail.com> wrote:
>
>> I'm using the MatrixFactorizationModel.predict() method and have
>> encountered the following exception:
>>
>> Name: java.util.NoSuchElementException
>> Message: next on empty iterator
>> StackTrace: scala.collection.Iterator$$anon$2.next(Iterator.scala:39)
>> scala.collection.Iterator$$anon$2.next(Iterator.scala:37)
>> scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:64)
>> scala.collection.IterableLike$class.head(IterableLike.scala:91)
>> scala.collection.mutable.ArrayBuffer.scala$collection$IndexedSeqOptimized$$super$head(ArrayBuffer.scala:47)
>> scala.collection.IndexedSeqOptimized$class.head(IndexedSeqOptimized.scala:120)
>> scala.collection.mutable.ArrayBuffer.head(ArrayBuffer.scala:47)
>> org.apache.spark.mllib.recommendation.MatrixFactorizationModel.predict(MatrixFactorizationModel.scala:81)
>> $line78.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:74)
>> $line78.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:79)
>> $line78.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:81)
>> $line78.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:83)
>> $line78.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:85)
>> $line78.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:87)
>> $line78.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:89)
>> $line78.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:91)
>> $line78.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:93)
>> $line78.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:95)
>> $line78.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:97)
>> $line78.$read$$iwC$$iwC$$iwC.<init>(<console>:99)
>> $line78.$read$$iwC$$iwC.<init>(<console>:101)
>> $line78.$read$$iwC.<init>(<console>:103)
>> $line78.$read.<init>(<console>:105)
>> $line78.$read$.<init>(<console>:109)
>> $line78.$read$.<clinit>(<console>)
>> $line78.$eval$.<init>(<console>:7)
>> $line78.$eval$.<clinit>(<console>)
>> $line78.$eval.$print(<console>)
>> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
>> java.lang.reflect.Method.invoke(Method.java:507)
>> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
>> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>> org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>> org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>> com.ibm.spark.interpreter.ScalaInterpreter$$anonfun$interpretAddTask$1$$anonfun$apply$3.apply(ScalaInterpreter.scala:296)
>> com.ibm.spark.interpreter.ScalaInterpreter$$anonfun$interpretAddTask$1$$anonfun$apply$3.apply(ScalaInterpreter.scala:291)
>> com.ibm.spark.global.StreamState$.withStreams(StreamState.scala:80)
>> com.ibm.spark.interpreter.ScalaInterpreter$$anonfun$interpretAddTask$1.apply(ScalaInterpreter.scala:290)
>> com.ibm.spark.interpreter.ScalaInterpreter$$anonfun$interpretAddTask$1.apply(ScalaInterpreter.scala:290)
>> com.ibm.spark.utils.TaskManager$$anonfun$add$2$$anon$1.run(TaskManager.scala:123)
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1153)
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>> java.lang.Thread.run(Thread.java:785)
>>
>> This took some debugging to figure out, but looking at the predict()
>> implementation, it seems to assume that there will always be features
>> found for the provided user and product ids:
>>
>>
>>   /** Predict the rating of one user for one product. */
>>   @Since("0.8.0")
>>   def predict(user: Int, product: Int): Double = {
>>     val userVector = userFeatures.lookup(user).head
>>     val productVector = productFeatures.lookup(product).head
>>     blas.ddot(rank, userVector, 1, productVector, 1)
>>   }
>>
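>> As a stop-gap I'm guarding the lookups on the caller side with something
>> like the helper below (just a rough sketch: safePredict is my own name,
>> not part of the API, and I compute the dot product inline rather than
>> pulling in the netlib BLAS import):
>>
>>   import org.apache.spark.mllib.recommendation.MatrixFactorizationModel
>>
>>   def safePredict(model: MatrixFactorizationModel, user: Int, product: Int): Double = {
>>     // Fail with a clear message when an id is missing, instead of
>>     // "next on empty iterator" from calling head on an empty lookup result.
>>     val userVector = model.userFeatures.lookup(user).headOption
>>       .getOrElse(throw new IllegalArgumentException(s"User ID $user not found in model"))
>>     val productVector = model.productFeatures.lookup(product).headOption
>>       .getOrElse(throw new IllegalArgumentException(s"Product ID $product not found in model"))
>>     // Dot product of the latent factor vectors, equivalent to blas.ddot.
>>     userVector.zip(productVector).map { case (u, p) => u * p }.sum
>>   }
>>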
>> It would be helpful if a more useful exception were raised, e.g.
>>
>> MissingUserFeatureException: "User ID ${user} not found in model"
>> MissingProductFeatureException: "Product ID ${product} not found in model"
>>
>> WDYT?
>>
