Github user gaborgsomogyi commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19893#discussion_r155608281
  
    --- Diff: core/src/test/scala/org/apache/spark/SparkFunSuite.scala ---
    @@ -34,12 +36,53 @@ abstract class SparkFunSuite
       with Logging {
     // scalastyle:on
     
    +  val threadWhiteList = Set(
    +    /**
    +     * Netty related threads.
    +     */
    +    "netty.*",
    +
    +    /**
    +     * A Single-thread singleton EventExecutor inside netty which creates such threads.
    +     */
    +    "globalEventExecutor.*",
    +
    +    /**
    +     * Netty creates such threads.
    +     * Checks if a thread is alive periodically and runs a task when a thread dies.
    +     */
    +    "threadDeathWatcher.*",
    +
    +    /**
    +     * These threads are created by Spark when the internal RPC environment is initialized and are used later.
    --- End diff --
    
    TaskSetManagerSuite.test("TaskSet with no preferences") prints this out:
    
    ===== POSSIBLE THREAD LEAK IN SUITE o.a.s.scheduler.TaskSetManagerSuite, thread names: rpc-client-1-1, rpc-server-3-2, rpc-client-1-4, rpc-server-3-6, rpc-client-1-3, rpc-server-3-7, rpc-server-3-1, rpc-server-3-8, rpc-server-3-4, rpc-client-1-7, rpc-client-1-6, rpc-client-1-2, rpc-server-3-5, shuffle-client-4-1, shuffle-server-5-1, rpc-server-3-3, rpc-client-1-5, rpc-client-1-8 =====
    
    This is the result even if the test contains only this line:
    
        sc = new SparkContext("local", "test")
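
    To illustrate the idea behind the whitelist (not part of this PR): a minimal, self-contained sketch of how thread-name patterns like the ones above could be matched against the threads that are still alive after a suite. The object name, the pattern subset and the suspiciousThreadNames helper below are hypothetical, not the actual SparkFunSuite code:

        import scala.collection.JavaConverters._

        object ThreadWhiteListSketch {
          // Illustrative subset of whitelisted patterns: the ones from the diff above
          // plus the rpc/shuffle patterns seen in the leak message.
          val threadWhiteList = Set(
            "netty.*",
            "globalEventExecutor.*",
            "threadDeathWatcher.*",
            "rpc-client.*",
            "rpc-server.*",
            "shuffle-client.*",
            "shuffle-server.*"
          )

          // Names of live threads that do not match any whitelisted pattern.
          def suspiciousThreadNames(): Set[String] = {
            Thread.getAllStackTraces.keySet().asScala
              .map(_.getName)
              .filterNot(name => threadWhiteList.exists(name.matches))
              .toSet
          }

          def main(args: Array[String]): Unit = {
            val leaked = suspiciousThreadNames()
            if (leaked.nonEmpty) {
              println(s"===== POSSIBLE THREAD LEAK, thread names: ${leaked.mkString(", ")} =====")
            }
          }
        }

    Without the rpc-client.*, rpc-server.* and shuffle-client.*/shuffle-server.* patterns, the threads listed in the message above would be reported for a test that does nothing more than create the SparkContext.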


