GitHub user HeartSaVioR commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21152#discussion_r184213878
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/streaming/continuous/ContinuousSuite.scala ---
    @@ -66,157 +66,115 @@ class ContinuousSuite extends ContinuousSuiteBase {
         val input = ContinuousMemoryStream[Int]
     
         testStream(input.toDF())(
    -      AddData(input, 0, 1, 2),
    -      CheckAnswer(0, 1, 2),
    +      AddData(input, 0.to(2): _*),
    +      CheckAnswer(0.to(2): _*),
           StopStream,
    -      AddData(input, 3, 4, 5),
    +      AddData(input, 3.to(5): _*),
           StartStream(),
    -      CheckAnswer(0, 1, 2, 3, 4, 5))
    +      CheckAnswer(0.to(5): _*))
       }
     
       test("map") {
    -    val df = spark.readStream
    -      .format("rate")
    -      .option("numPartitions", "5")
    -      .option("rowsPerSecond", "5")
    -      .load()
    -      .select('value)
    -      .map(r => r.getLong(0) * 2)
    +    val input = ContinuousMemoryStream[Int]
    +    val df = input.toDF().map(_.getInt(0) * 2)
     
    -    testStream(df, useV2Sink = true)(
    -      StartStream(longContinuousTrigger),
    -      AwaitEpoch(0),
    -      Execute(waitForRateSourceTriggers(_, 2)),
    -      IncrementEpoch(),
    -      Execute(waitForRateSourceTriggers(_, 4)),
    -      IncrementEpoch(),
    -      CheckAnswerRowsContains(scala.Range(0, 40, 2).map(Row(_))))
    +    testStream(df)(
    +      AddData(input, 0.to(2): _*),
    +      CheckAnswer(0.to(2).map(_ * 2): _*),
    --- End diff ---
    
    @jose-torres 
    Yeah, my intention was to ensure that Spark operations give the same results as the equivalent Scala collection methods, but sure, enumerating the literals is also fine, since we can all see the expected result easily.
    Are you in favor of enumerating the literals we already know, instead of computing them, for all the tests? Or only for this line? I'd just like to apply the approach consistently.
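    For illustration, here is a minimal sketch of the two styles in question, based on the `map` test in this diff (the enumerated expected values 0, 2, 4 are just 0, 1, 2 doubled by hand):

        val input = ContinuousMemoryStream[Int]
        val df = input.toDF().map(_.getInt(0) * 2)

        // Style A: derive both the inputs and the expected answers from ranges
        testStream(df)(
          AddData(input, 0.to(2): _*),
          CheckAnswer(0.to(2).map(_ * 2): _*))

        // Style B: enumerate the literals we already know up front
        testStream(df)(
          AddData(input, 0, 1, 2),
          CheckAnswer(0, 2, 4))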


---
