Github user devaraj-kavali commented on the issue:

    https://github.com/apache/spark/pull/22623
  
    Thanks @srowen for looking into this.
    
    > ThreadUtils.scala
    
    ```
          case NonFatal(t) if !t.isInstanceOf[TimeoutException] =>
            throw new SparkException("Exception thrown in awaitResult: ", t)
    ```
    
    Here the cause gets wrapped in a SparkException along with the message, and in SparkSubmit.scala only that message is printed; the wrapped cause is discarded.
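    
    To make the effect concrete, here is a minimal, self-contained sketch of that pattern (using a plain `RuntimeException` in place of `SparkException`, purely for illustration):
    
    ```scala
    object CauseLossDemo {
      def main(args: Array[String]): Unit = {
        val cause = new IllegalStateException("real failure in the user application")
        // What ThreadUtils.awaitResult does: wrap the cause under a generic message.
        val wrapped = new RuntimeException("Exception thrown in awaitResult: ", cause)
    
        // What SparkSubmit currently does: print only the message.
        // This prints "Exception thrown in awaitResult: " with nothing about the cause.
        System.err.println(wrapped.getMessage)
    
        // Printing the stack trace instead preserves the "Caused by:" chain.
        wrapped.printStackTrace(System.err)
      }
    }
    ```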
    
    > SparkSubmit.scala
    
    ```
          override def doSubmit(args: Array[String]): Unit = {
            try {
              super.doSubmit(args)
            } catch {
              case e: SparkUserAppException =>
                exitFn(e.exitCode)
              case e: SparkException =>
                printErrorAndExit(e.getMessage())
            }
    ```
    
    
    The other option is to print the whole stack trace here instead of just the message; a rough sketch of that option follows. Please let me know your thoughts, and I can make the change accordingly.
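    
    For illustration only (not the final patch), the catch block could look roughly like this; it assumes `Utils.exceptionString` (Spark's existing helper that renders a throwable with its full cause chain as a string) is reachable from SparkSubmit.scala:
    
    ```scala
          override def doSubmit(args: Array[String]): Unit = {
            try {
              super.doSubmit(args)
            } catch {
              case e: SparkUserAppException =>
                exitFn(e.exitCode)
              case e: SparkException =>
                // Print the full stack trace, including the "Caused by:" chain,
                // instead of only e.getMessage, so the wrapped cause is not lost.
                printErrorAndExit(Utils.exceptionString(e))
            }
          }
    ```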

