Github user sahilTakiar commented on the issue:

    https://github.com/apache/spark/pull/20893
  
    I wrote a test in `SparkLauncherSuite` and was able to replicate the error 
from HIVE-18533, but then realized the exception is only logged and then 
swallowed. From `SparkContext`:
    
    ```scala
        if (_dagScheduler != null) {
          Utils.tryLogNonFatalError {
            _dagScheduler.stop()
          }
          _dagScheduler = null
        }
    ```
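    For context, a minimal sketch of the swallowing behavior described above: a 
helper that runs a block and logs, rather than rethrows, any non-fatal 
exception, so callers never see the failure. This is an illustrative 
approximation, not the actual `Utils.tryLogNonFatalError` source; `println` 
stands in for Spark's logging framework.
    
    ```scala
    import scala.util.control.NonFatal
    
    // Sketch: run the block; if it throws a non-fatal exception,
    // log it and swallow it so the caller's control flow continues.
    def tryLogNonFatalError(block: => Unit): Unit = {
      try block
      catch {
        case NonFatal(e) =>
          // Spark routes this through its logger; println keeps the sketch
          // self-contained.
          println(s"Uncaught exception in thread: ${e.getMessage}")
      }
    }
    
    // The exception below is logged and swallowed, so this does not throw:
    tryLogNonFatalError { throw new IllegalStateException("stop failed") }
    println("execution continues")
    ```
    
    This is why a test can't simply assert on a thrown exception here: the 
failure never propagates out of `stop()`.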
    
    It seems the failures I was seeing in HIVE-18533 are due to something else.
    
    Regardless, this is still probably a good fix, since you don't want to 
write to the connection unless it's open; but given that the exception is only 
logged and not thrown, I don't see an easy way to write a test for this.

