Github user holdenk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/12660#discussion_r61145104
  
    --- Diff: mllib/src/test/scala/org/apache/spark/ml/recommendation/ALSSuite.scala ---
    @@ -512,6 +514,55 @@ class ALSSuite
         assert(getFactors(model.userFactors) === getFactors(model2.userFactors))
         assert(getFactors(model.itemFactors) === getFactors(model2.itemFactors))
       }
    +
    +  test("StorageLevel param") {
    +    // test invalid param values
    +    intercept[IllegalArgumentException] {
    +      new ALS().setIntermediateRDDStorageLevel("foo")
    +    }
    +    intercept[IllegalArgumentException] {
    +      new ALS().setIntermediateRDDStorageLevel("NONE")
    +    }
    +    intercept[IllegalArgumentException] {
    +      new ALS().setFinalRDDStorageLevel("foo")
    +    }
    +    // test StorageLevels
    +    val sqlContext = this.sqlContext
    +    import sqlContext.implicits._
    +    val (ratings, _) = genExplicitTestData(numUsers = 2, numItems = 2, rank = 1)
    +    val data = ratings.toDF
    +    val als = new ALS().setMaxIter(1)
    +    als.fit(data)
    +    val factorRDD = sc.getPersistentRDDs.collect {
    --- End diff --
    
    So I don't think that is quite the case - the intermediate stages of the ALS
algorithm work with RDDs rather than DataFrames, and this test is asserting what
the storage level should be for those intermediate RDDs.
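    
    For readers following along: the intermediate state ALS persists during training
(block and factor RDDs) stays registered with the SparkContext, which is why the
test above can inspect it through `sc.getPersistentRDDs` and check each RDD's
storage level. The snippet below is a minimal, standalone sketch of that style of
check, not the continuation of the truncated test above; the `"userFactors"` RDD
name and the `StorageLevel.MEMORY_AND_DISK` expectation are assumptions made for
illustration.
    
        import org.apache.spark.ml.recommendation.ALS
        import org.apache.spark.sql.SparkSession
        import org.apache.spark.storage.StorageLevel
    
        // Hypothetical standalone sketch, not the PR's test body.
        val spark = SparkSession.builder()
          .master("local[2]")
          .appName("als-storage-level-check")
          .getOrCreate()
        val sc = spark.sparkContext
        import spark.implicits._
    
        // Tiny synthetic ratings DataFrame; the suite's genExplicitTestData helper
        // is not reproduced here.
        val ratings = Seq((0, 0, 4.0f), (0, 1, 2.0f), (1, 0, 3.0f), (1, 1, 5.0f))
          .toDF("user", "item", "rating")
    
        val als = new ALS().setMaxIter(1).setRank(1)
        als.fit(ratings)
    
        // ALS's persisted intermediate RDDs are still visible via the SparkContext
        // right after fit(); print their names and storage levels.
        sc.getPersistentRDDs.values.foreach { rdd =>
          println(s"${rdd.name}: ${rdd.getStorageLevel}")
        }
    
        // A check in the spirit of the test: find a factor RDD by name (the name
        // "userFactors" is an assumption) and compare its storage level with the
        // level we expect for intermediate RDDs (assumed MEMORY_AND_DISK).
        sc.getPersistentRDDs.values.find(_.name == "userFactors").foreach { rdd =>
          assert(rdd.getStorageLevel == StorageLevel.MEMORY_AND_DISK)
        }
    
        spark.stop()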

