squito commented on a change in pull request #33936:
URL: https://github.com/apache/spark/pull/33936#discussion_r706380995



##########
File path: repl/src/test/scala/org/apache/spark/repl/ReplSuite.scala
##########
@@ -382,4 +391,41 @@ class ReplSuite extends SparkFunSuite with BeforeAndAfterAll {
     assertContains(infoLogMessage1, out)
     assertContains(infoLogMessage2, out)
   }
+
+  def runInterpreterAndGetErrors(input: String, readLineDelay: Long): List[String] = {
+    import scala.collection.JavaConverters._
+    import org.apache.log4j.{FileAppender, SimpleLayout}
+
+    val tempFile = Files.createTempFile("repl-test", ".log")
+    tempFile.toFile.deleteOnExit()
+
+    val fileAppender = new FileAppender(new SimpleLayout, tempFile.toFile.getAbsolutePath)
+    fileAppender.setThreshold(Level.ERROR)
+    LogManager.getRootLogger.addAppender(fileAppender)
+    try {
+      runInterpreter("local", input, readLineDelay)
+    } finally {
+      LogManager.getRootLogger.removeAppender(fileAppender)
+    }
+    Files.readAllLines(tempFile).asScala.toList
+  }
+
+  test("inactivity timeout is triggered") {
+    InactivityTimeout.inactivityTimeoutMs = 500
+    InactivityTimeout.isTest = true

Review comment:
       Rather than making these `var`s and setting them like this, can you make them confs which you set inside the test? Then they can be `val`s in the main code.
   
   If you do set them like this, they have to be reset back to their original values in a `finally` or `afterEach` etc.
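
   For the second option, a minimal sketch (plain Scala, no Spark or ScalaTest dependencies; the default values here are made up) of a helper that mutates the shared `var`s and restores them in a `finally`:

   ```scala
   // Stand-in for the object under review; the real defaults may differ.
   object InactivityTimeout {
     var inactivityTimeoutMs: Long = 60000L
     var isTest: Boolean = false
   }

   // Hypothetical test helper: override the vars for the duration of `body`,
   // then restore the saved values even if `body` throws.
   def withInactivityTimeout[T](timeoutMs: Long)(body: => T): T = {
     val savedTimeout = InactivityTimeout.inactivityTimeoutMs
     val savedIsTest = InactivityTimeout.isTest
     InactivityTimeout.inactivityTimeoutMs = timeoutMs
     InactivityTimeout.isTest = true
     try body
     finally {
       // Reset so later tests observe the original configuration.
       InactivityTimeout.inactivityTimeoutMs = savedTimeout
       InactivityTimeout.isTest = savedIsTest
     }
   }
   ```

   A test would then wrap its body in `withInactivityTimeout(500L) { ... }` instead of assigning the `var`s directly.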




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


