Github user aarondav commented on a diff in the pull request:
https://github.com/apache/spark/pull/956#discussion_r13459818
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/SqlParser.scala ---
@@ -41,10 +41,26 @@ import org.apache.spark.sql.catalyst.types._
* for a SQL like language should checkout the HiveQL support in the sql/hive sub-project.
*/
class SqlParser extends StandardTokenParsers with PackratParsers {
+
def apply(input: String): LogicalPlan = {
- phrase(query)(new lexical.Scanner(input)) match {
- case Success(r, x) => r
- case x => sys.error(x.toString)
+ // Special-case out set commands since the value fields can be
+ // complex to handle without RegexParsers. Also this approach
+ // is clearer for the several possible cases of set commands.
+ if (input.toLowerCase.startsWith("set")) {
+ val kvPair = input.drop(3).split("=")
--- End diff --
Does this handle the case where the value string contains an "="? You might
use `.split("=", 2)`. You could also refactor this to something like:
```scala
input.drop(3).split("=", 2).map(_.trim) match {
case Array() | Array("") => // "set"
SetCommand(None, None)
case Array(key) => // "set key"
SetCommand(Some(key), None)
case Array(key, value) => // "set key=value"
SetCommand(Some(key), Some(value))
}
```
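(Just to illustrate the difference, this is plain `String#split` behavior rather than anything from this PR:)
```scala
"key=a=b".split("=")      // Array("key", "a", "b")  -- the value gets broken apart
"key=a=b".split("=", 2)   // Array("key", "a=b")     -- limit 2 keeps the value intact
"".split("=", 2)          // Array("")               -- bare "set" hits the Array("") case above
```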
Also, you may want to handle the case where a space precedes the "set"; I'm not
sure if that's possible. (I'm sure there's also some extraordinarily beautiful
solution that integrates with the Scala parsers, blinding all who dare to look
upon it.)
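If someone does want to try the combinator route, a rough, purely illustrative sketch (not part of this PR) using `RegexParsers` might look like the following; whitespace before the "set" gets skipped for free, and `SetCommand` here just stands in for the logical plan node referenced in the diff above:
```scala
import scala.util.parsing.combinator.RegexParsers

// Illustrative sketch only -- assumes a SetCommand(key: Option[String], value: Option[String])
// node shaped like the one used in the diff.
object SetCommandParser extends RegexParsers {
  case class SetCommand(key: Option[String], value: Option[String])

  // "set", optionally followed by a key, optionally followed by "=value".
  // RegexParsers skips leading whitespace by default, so " set foo=bar" parses too.
  def set: Parser[SetCommand] =
    """(?i)set\b""".r ~> opt("""[^=\s]+""".r ~ opt("=" ~> """.*""".r)) ^^ {
      case None                    => SetCommand(None, None)                  // "set"
      case Some(key ~ None)        => SetCommand(Some(key), None)             // "set key"
      case Some(key ~ Some(value)) => SetCommand(Some(key), Some(value.trim)) // "set key=value"
    }

  def apply(input: String): ParseResult[SetCommand] = parseAll(set, input)
}
```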