[
https://issues.apache.org/jira/browse/SPARK-16603?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15391566#comment-15391566
]
Liang Ke edited comment on SPARK-16603 at 7/25/16 9:20 AM:
-----------------------------------------------------------
Sorry, I read the Spark source code again and found this is a usage error:
the correct usage is to quote the column name. After quoting it, the query runs without error:
> select * from tsp where 'tsp.20_user_addr' <10;
16/07/25 18:05:34 INFO SparkSqlParser: Parsing command: select * from tsp where
'tsp.20_user_addr' <10
16/07/25 18:05:35 INFO HiveMetaStore: 0: create_database:
Database(name:default, description:default database,
locationUri:hdfs://ht-chen-slave1:8020/apps/root/warehouse, parameters:{})
16/07/25 18:05:35 INFO audit: ugi=root ip=unknown-ip-addr
cmd=create_database: Database(name:default, description:default database,
locationUri:hdfs://ht-chen-slave1:8020/apps/root/warehouse, parameters:{})
16/07/25 18:05:35 INFO HiveMetaStore: 0: get_table : db=default tbl=tsp
16/07/25 18:05:35 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table :
db=default tbl=tsp
16/07/25 18:05:35 INFO CatalystSqlParser: Parsing command: int
16/07/25 18:05:35 INFO CatalystSqlParser: Parsing command: string
16/07/25 18:05:35 INFO HiveMetaStore: 0: get_table : db=default tbl=tsp
16/07/25 18:05:35 INFO audit: ugi=root ip=unknown-ip-addr cmd=get_table :
db=default tbl=tsp
16/07/25 18:05:35 INFO CatalystSqlParser: Parsing command: int
16/07/25 18:05:35 INFO CatalystSqlParser: Parsing command: string
16/07/25 18:05:36 INFO CodeGenerator: Code generated in 300.833934 ms
Time taken: 1.793 seconds
16/07/25 18:05:36 INFO CliDriver: Time taken: 1.793 seconds
So, [~marymwu], this is not a bug.
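One caveat worth noting: Spark SQL quotes identifiers with backticks, not single quotes, so `'tsp.20_user_addr'` in the log above is parsed as a string literal rather than a column reference (which is also why it produces no parse error). A minimal sketch of the quoting rule being assumed here, written as a plain Python helper (the function names and the bare-identifier pattern are illustrative assumptions, not Spark APIs):

```python
import re

def needs_quoting(name: str) -> bool:
    # Assume a bare identifier must match [A-Za-z_][A-Za-z0-9_]*;
    # anything else (e.g. a leading digit, as in 20_user_addr)
    # must be backtick-quoted before use in a statement.
    return re.fullmatch(r"[A-Za-z_][A-Za-z0-9_]*", name) is None

def quote_ident(name: str) -> str:
    # Backtick-quote an identifier, doubling any embedded backticks.
    return "`" + name.replace("`", "``") + "`"

col = "20_user_addr"
if needs_quoting(col):
    col = quote_ident(col)
print(f"SELECT * FROM tsp WHERE tsp.{col} < 10")
# prints: SELECT * FROM tsp WHERE tsp.`20_user_addr` < 10
```

With backtick quoting the predicate actually references the column, whereas the single-quoted form compares the constant string 'tsp.20_user_addr' against 10.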
> Spark 2.0 fails to execute SQL statements whose field names begin with a
> number, like "d.30_day_loss_user", which Spark 1.6 supports
> ----------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-16603
> URL: https://issues.apache.org/jira/browse/SPARK-16603
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: marymwu
> Priority: Minor
>
> Spark 2.0 fails to execute SQL statements whose field names begin with a
> number, like "d.30_day_loss_user", which Spark 1.6 supports.
> Error: org.apache.spark.sql.catalyst.parser.ParseException: mismatched input '.30' expecting {')', ','}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)