GitHub user wangxiaojing opened a pull request:

    https://github.com/apache/spark/pull/2790

    [SPARK-3940][SQL] SQL prints the error code three times

    For an invalid SQL statement, the console should print the error only once.
    For example:

    spark-sql> show tabless;
    show tabless;
    14/10/13 21:03:48 INFO ParseDriver: Parsing command: show tabless
    NoViableAltException(26@[598:1: ddlStatement : ( createDatabaseStatement | switchDatabaseStatement | dropDatabaseStatement | createTableStatement | dropTableStatement | truncateTableStatement | alterStatement | descStatement | showStatement | metastoreCheck | createViewStatement | dropViewStatement | createFunctionStatement | createMacroStatement | createIndexStatement | dropIndexStatement | dropFunctionStatement | dropMacroStatement | analyzeStatement | lockStatement | unlockStatement | createRoleStatement | dropRoleStatement | grantPrivileges | revokePrivileges | showGrants | showRoleGrants | grantRole | revokeRole );])
        at org.antlr.runtime.DFA.noViableAlt(DFA.java:158)
        at org.antlr.runtime.DFA.predict(DFA.java:144)
        at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:1962)
        at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:1298)
        at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:938)
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:190)
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:161)
        at org.apache.spark.sql.hive.HiveQl$.getAst(HiveQl.scala:218)
        at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:226)
        at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:50)
        at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:49)
        at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
        at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
        at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
        at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
        at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
        at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
        at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
        at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(SparkSQLParser.scala:31)
        at org.apache.spark.sql.hive.HiveQl$$anonfun$3.apply(HiveQl.scala:130)
        at org.apache.spark.sql.hive.HiveQl$$anonfun$3.apply(HiveQl.scala:130)
        at org.apache.spark.sql.catalyst.SparkSQLParser$$anonfun$org$apache$spark$sql$catalyst$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:184)
        at org.apache.spark.sql.catalyst.SparkSQLParser$$anonfun$org$apache$spark$sql$catalyst$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:183)
        at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
        at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
        at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
        at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
        at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
        at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
        at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
        at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(SparkSQLParser.scala:31)
        at org.apache.spark.sql.hive.HiveQl$.parseSql(HiveQl.scala:221)
        at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:98)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:58)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:274)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:209)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
    14/10/13 21:03:49 ERROR SparkSQLDriver: Failed in [show tabless]
    org.apache.spark.sql.hive.HiveQl$ParseException: Failed to parse: show tabless
        at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:225)
        at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:50)
        at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:49)
        at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
        at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
        at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
        at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
        at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
        at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
        at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
        at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(SparkSQLParser.scala:31)
        at org.apache.spark.sql.hive.HiveQl$$anonfun$3.apply(HiveQl.scala:130)
        at org.apache.spark.sql.hive.HiveQl$$anonfun$3.apply(HiveQl.scala:130)
        at org.apache.spark.sql.catalyst.SparkSQLParser$$anonfun$org$apache$spark$sql$catalyst$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:184)
        at org.apache.spark.sql.catalyst.SparkSQLParser$$anonfun$org$apache$spark$sql$catalyst$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:183)
        at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
        at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
        at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
        at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
        at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
        at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
        at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
        at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
        at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(SparkSQLParser.scala:31)
        at org.apache.spark.sql.hive.HiveQl$.parseSql(HiveQl.scala:221)
        at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:98)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:58)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:274)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:209)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
    Caused by: org.apache.hadoop.hive.ql.parse.ParseException: line 1:5 cannot recognize input near 'show' 'tabless' '<EOF>' in ddl statement
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:193)
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:161)
        at org.apache.spark.sql.hive.HiveQl$.getAst(HiveQl.scala:218)
        at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:226)
        ... 47 more
    Time taken: 4.35 seconds
    14/10/13 21:03:51 INFO CliDriver: Time taken: 4.35 seconds
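This kind of duplication typically arises when several layers in a call chain each log a failure and then rethrow it. The sketch below is not the Spark source; it is a minimal, hypothetical Java illustration (class and method names invented) of how a parser layer, a driver layer, and a top-level CLI handler can each emit the same error, producing it three times:

```java
// Illustrative sketch only: three layers that each log and rethrow
// the same exception, so the error text appears three times.
public class DuplicateLogging {
    static int logCount = 0; // counts how many times the error was printed

    static void log(String msg) {
        logCount++;
        System.err.println("ERROR " + msg);
    }

    // Innermost layer: the parse fails.
    static void parse() {
        throw new IllegalArgumentException(
            "cannot recognize input near 'show' 'tabless'");
    }

    // Parser layer: logs the failure, then rethrows it (first print).
    static void parserLayer() {
        try {
            parse();
        } catch (RuntimeException e) {
            log(e.getMessage());
            throw e;
        }
    }

    // Driver layer: logs again and rethrows (second print).
    static void driverLayer() {
        try {
            parserLayer();
        } catch (RuntimeException e) {
            log(e.getMessage());
            throw e;
        }
    }

    // Top-level handler: logs a third time (third print).
    public static void main(String[] args) {
        try {
            driverLayer();
        } catch (RuntimeException e) {
            log(e.getMessage());
        }
        System.out.println("error printed " + logCount + " times");
    }
}
```

Under this reading, the fix is to log the error at exactly one layer (usually the outermost handler) and let the inner layers only rethrow.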

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/wangxiaojing/spark spark-3940

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/2790.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #2790
    
----
commit e2e5c140269cc9271e11ff33ca7f9221f567a89b
Author: wangxiaojing <[email protected]>
Date:   2014-10-14T04:00:36Z

    sql Print the error code three times

----

