Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4771#discussion_r25447562
  
    --- Diff: sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/HiveThriftServer2.scala ---
    @@ -34,6 +36,7 @@ import org.apache.spark.scheduler.{SparkListenerApplicationEnd, SparkListener}
      * `HiveThriftServer2` thrift server.
      */
     object HiveThriftServer2 extends Logging {
    +  val SHUTDOWN_HOOK_PRIORITY: Int = 30
    --- End diff ---
    
    I'd add a helper method in Utils.scala for registering shutdown hooks, so the priority can live in a single shared constant and callers can avoid the ugly `new Runnable` syntax by passing a Scala closure (`() => Unit`).
    
    Not a big deal, but would make the patch a lot cleaner. :-)
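
    To sketch the idea (the object name, registry, and error handling below are hypothetical, not the actual Spark code): a single JVM-level hook could drain a registry of closures in priority order, something like:

    ```scala
    import scala.collection.mutable

    object ShutdownHooks {
      // Registry of (priority, closure) pairs; higher priority runs first.
      private val hooks = mutable.ArrayBuffer.empty[(Int, () => Unit)]

      // One real JVM shutdown hook that runs the registered closures in order.
      Runtime.getRuntime.addShutdownHook(new Thread("ordered-shutdown-hooks") {
        override def run(): Unit = {
          val ordered = hooks.synchronized(hooks.sortBy(-_._1).toList)
          ordered.foreach { case (_, hook) =>
            try hook() catch { case e: Throwable => e.printStackTrace() }
          }
        }
      })

      /** Register a Scala closure to run at JVM shutdown. */
      def addShutdownHook(priority: Int)(hook: () => Unit): Unit =
        hooks.synchronized { hooks += priority -> hook }
    }
    ```

    A call site would then reduce to `addShutdownHook(SHUTDOWN_HOOK_PRIORITY) { () => ... }` instead of an anonymous `Runnable`.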

