GitHub user liancheng commented on a diff in the pull request:

    https://github.com/apache/spark/pull/1686#discussion_r15681232
  
    --- Diff: sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLDriver.scala ---
    @@ -53,10 +53,9 @@ private[hive] class SparkSQLDriver(val context: HiveContext = SparkSQLEnv.hiveCo
       }
     
       override def run(command: String): CommandProcessorResponse = {
    -    val execution = context.executePlan(context.hql(command).logicalPlan)
    -
         // TODO unify the error code
         try {
    +      val execution = context.executePlan(context.hql(command).logicalPlan)
    --- End diff --
    
    (Actually I'd like to add this comment to the `catch` clause of this `try` block, but GitHub doesn't allow me to.)
    
    One annoying issue here is that the thrown exception is only recorded in the logger, not reported to the console. Thus, when using `bin/spark-sql`, users only see the following unintuitive exception:
    
    ```
    $ ./bin/spark-sql
    spark-sql> foo;
    NoViableAltException(26@[])
            at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:902)
            at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:190)
            at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:161)
            ...
    ```
    
    Unless they run `spark-sql --hiveconf hive.root.logger=INFO,console` (similar to `shark-withinfo` in Shark):
    
    ```
    $ ./bin/spark-sql --hiveconf hive.root.logger=INFO,console
    spark-sql> foo;
    14/08/01 11:17:03 INFO parse.ParseDriver: Parsing command: foo
    NoViableAltException(26@[])
            at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:902)
            at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:190)
            at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:161)
            ...
    14/08/01 11:17:04 ERROR thriftserver.SparkSQLDriver: Failed in [foo]
    org.apache.spark.sql.hive.HiveQl$ParseException: Failed to parse: foo
            at org.apache.spark.sql.hive.HiveQl$.parseSql(HiveQl.scala:214)
            at org.apache.spark.sql.hive.HiveContext.hiveql(HiveContext.scala:76)
            at org.apache.spark.sql.hive.HiveContext.hql(HiveContext.scala:79)
            ...
    ```
    
    To fix this, we can print the exception (if any) to the console after checking the response code of the `CommandProcessorResponse` object in [`SparkSQLCLIDriver`](https://github.com/apache/spark/blob/8f51491ea78d8e88fc664c2eac3b4ac14226d98f/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala#L291).
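
    Roughly something like the following (just a sketch, not the exact code — the `driver` and `console` names, and the assumption that the error message is reachable via `getErrorMessage`, follow the usual Hive `CliDriver` pattern inside `processCmd`):

    ```scala
    // Sketch only (inside SparkSQLCLIDriver.processCmd, assuming `driver` is the
    // SparkSQLDriver and `cmd` is the current statement): after running the command,
    // check the response code and echo the error to the console instead of only the log.
    val rc: CommandProcessorResponse = driver.run(cmd)
    val ret = rc.getResponseCode
    if (ret != 0) {
      // getErrorMessage / getSQLState come from Hive's CommandProcessorResponse;
      // console is the CLI's SessionState.LogHelper.
      console.printError(rc.getErrorMessage, rc.getSQLState)
      driver.close()
    }
    ```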

