[
https://issues.apache.org/jira/browse/SPARK-8628?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14601012#comment-14601012
]
Santiago M. Mola commented on SPARK-8628:
-----------------------------------------
Here is an example of failure with Spark 1.4.0:
{code}
[1.152] failure: ``union'' expected but identifier OR found
SELECT CASE a+1 WHEN b THEN 111 WHEN c THEN 222 WHEN d THEN 333 WHEN e THEN 444
ELSE 555 END, a-b, a FROM t1 WHERE e+d BETWEEN a+b-10 AND c+130 OR a>b OR d>e
^
java.lang.RuntimeException: [1.152] failure: ``union'' expected but identifier
OR found
SELECT CASE a+1 WHEN b THEN 111 WHEN c THEN 222 WHEN d THEN 333 WHEN e THEN 444
ELSE 555 END, a-b, a FROM t1 WHERE e+d BETWEEN a+b-10 AND c+130 OR a>b OR d>e
^
at scala.sys.package$.error(package.scala:27)
{code}
> Race condition in AbstractSparkSQLParser.parse
> ----------------------------------------------
>
> Key: SPARK-8628
> URL: https://issues.apache.org/jira/browse/SPARK-8628
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.3.0, 1.3.1, 1.4.0
> Reporter: Santiago M. Mola
> Priority: Critical
> Labels: regression
>
> SPARK-5009 introduced the following code:
> {code}
> def parse(input: String): LogicalPlan = {
>   // Initialize the Keywords.
>   lexical.initialize(reservedWords)
>   phrase(start)(new lexical.Scanner(input)) match {
>     case Success(plan, _) => plan
>     case failureOrError => sys.error(failureOrError.toString)
>   }
> }
> {code}
> The corresponding initialize method in SqlLexical is not thread-safe:
> {code}
> /* This is a work around to support the lazy setting */
> def initialize(keywords: Seq[String]): Unit = {
>   reserved.clear()
>   reserved ++= keywords
> }
> {code}
> I'm hitting this when parsing multiple SQL queries concurrently. When parsing
> of one query starts, it empties the reserved keyword list; a race condition
> then occurs, and other queries fail to parse because they recognize keywords
> as identifiers.