Github user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/3687#discussion_r21884849
  
    --- Diff: streaming/src/test/scala/org/apache/spark/streaming/InputStreamsSuite.scala ---
    @@ -238,14 +190,17 @@ class InputStreamsSuite extends TestSuiteBase with BeforeAndAfter {
         val testDir: File = null
         try {
           val testDir = Utils.createTempDir()
    +      // Create a file that exists before the StreamingContext is created
           val existingFile = new File(testDir, "0")
    -      Files.write("0\n", existingFile, Charset.forName("UTF-8"))
    +      Files.write("0\n", existingFile, Charsets.UTF_8)
    +      assert(existingFile.setLastModified(10000))
     
    -      Thread.sleep(1000)
           // Set up the streaming context and input streams
    -      val newConf = conf.clone.set(
    -        "spark.streaming.clock", 
"org.apache.spark.streaming.util.SystemClock")
    --- End diff ---
    
    This usage of `SystemClock` was a little tricky to fix. It looks like the
    reason this test used `Thread.sleep()` was that `FileInputDStream` relied on
    filesystem timestamps matching the system clock when determining whether
    files were new. Instead, we can just set those files' modification times
    manually. I had to spend a bit of time fiddling with the actual constants
    used in this test, so it would be great if someone could take a look to make
    sure I haven't inadvertently broken the test's ability to catch bugs (it
    might be good to whiteboard out a reasonable set of constants in terms of
    batchInterval, window size, etc.).

