[
https://issues.apache.org/jira/browse/SPARK-9505?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14696486#comment-14696486
]
Pangjiu commented on SPARK-9505:
--------------------------------
Hi James,
Thanks for the reply.
Yes, I am sure my code for querying the MySQL database is correct. The backtick
(`) is the escape character for MySQL column names, while [ ] are the escape
characters for MSSQL.
Hope to hear from you soon.
Thanks
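To make the escaping rules above concrete, here is a minimal, self-contained sketch of per-dialect identifier quoting (the object and method names are hypothetical, not part of Spark or any driver): MySQL wraps identifiers in backticks and doubles any embedded backtick, while SQL Server wraps them in square brackets and doubles any embedded closing bracket.

```scala
// Hypothetical helpers illustrating the two quoting styles discussed above.
object IdentifierQuoting {
  // MySQL: wrap in backticks; an embedded backtick is doubled.
  def quoteMySql(name: String): String =
    "`" + name.replace("`", "``") + "`"

  // MSSQL: wrap in square brackets; an embedded ']' is doubled.
  def quoteMsSql(name: String): String =
    "[" + name.replace("]", "]]") + "]"
}
```

So a column named NAME% would be written as `NAME%` in MySQL but [NAME%] in SQL Server, which is why SQL generated with one dialect's quoting fails on the other.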
> DataFrames : MySQL JDBC does not support column names with special characters
> ------------------------------------------------------------------------
>
> Key: SPARK-9505
> URL: https://issues.apache.org/jira/browse/SPARK-9505
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, SQL
> Affects Versions: 1.3.0
> Reporter: Pangjiu
> Priority: Blocker
>
> Hi all,
> I hit the above issue when connecting to a MySQL database through SQLContext. If a
> MySQL table's column name contains special characters like # [ ] %, it throws the
> exception: "You have an error in your SQL syntax".
> Below is the code:
> import org.apache.spark.sql.SQLContext
>
> Class.forName("com.mysql.jdbc.Driver").newInstance()
> val url = "jdbc:mysql://localhost:3306/sakila?user=root&password=xxx"
> val driver = "com.mysql.jdbc.Driver"
> val sqlContext = new SQLContext(sc)
> val output = sqlContext.load("jdbc", Map(
>   "url" -> url,
>   "driver" -> driver,
>   "dbtable" -> "(SELECT `ID`, `NAME%` FROM `agent`) AS tableA"
> ))
> I hope DataFrames via SQLContext can support special characters soon, as this has
> become a blocker for us.
> Thanks
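One workaround for the report quoted above, sketched here as a plain string builder rather than Spark API code (the object name `DbTableWorkaround` and the alias NAME_PCT are hypothetical): alias each special-character column to a plain identifier inside the dbtable subquery, so any SQL Spark later generates only ever references the safe aliases.

```scala
// Hypothetical sketch: build a dbtable subquery that renames
// special-character MySQL columns to safe aliases.
object DbTableWorkaround {
  // cols: (raw column name, safe alias) pairs, in the desired SELECT order.
  def subquery(table: String, cols: Seq[(String, String)]): String = {
    val select = cols.map { case (raw, alias) => s"`$raw` AS $alias" }.mkString(", ")
    s"(SELECT $select FROM `$table`) AS t"
  }
}
```

The resulting string would be passed as the "dbtable" option, so the backtick quoting lives entirely inside the subquery that MySQL itself parses.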
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]