[ 
https://issues.apache.org/jira/browse/SPARK-16968?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15413161#comment-15413161
 ] 

Jie Huang commented on SPARK-16968:
-----------------------------------

Yes, there are workarounds. E.g., create the table beforehand and run the export 
against the existing table. That assumes the user understands the schema up 
front. Our current workaround is to create the table, alter it as needed, and 
then run the DataFrameWriter. However, it is quite cumbersome.

If Spark allowed users to supply additional table-creation options, this would 
be much easier and more dynamic. 
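For reference, the workaround described above can be sketched as follows. This is a minimal Scala sketch, not the actual code: the URL, credentials, table name, and schema are placeholders, and it assumes a running MySQL server plus an existing SparkSession with a DataFrame `df`.

```scala
import java.sql.DriverManager
import java.util.Properties
import org.apache.spark.sql.SaveMode

val url = "jdbc:mysql://host:3306/db"          // placeholder connection URL
val props = new Properties()
props.setProperty("user", "user")              // placeholder credentials
props.setProperty("password", "password")

// Step 1: create the table by hand so we control the charset, since
// DataFrameWriter offers no hook for options at table-creation time.
val conn = DriverManager.getConnection(url, props)
try {
  conn.createStatement().executeUpdate(
    """CREATE TABLE IF NOT EXISTS customer (
      |  id   BIGINT,
      |  name TEXT
      |) DEFAULT CHARSET=utf8""".stripMargin)
} finally {
  conn.close()
}

// Step 2: append into the pre-created table instead of letting Spark
// create it with default options.
df.write.mode(SaveMode.Append).jdbc(url, "customer", props)
```

(For comparison, later Spark releases expose a `createTableOptions` option on the JDBC writer, e.g. `.option("createTableOptions", "DEFAULT CHARSET=utf8")`, which addresses exactly this use case.)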


> Allow to add additional options when creating a new table in DF's JDBC 
> writer. 
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-16968
>                 URL: https://issues.apache.org/jira/browse/SPARK-16968
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Jie Huang
>            Priority: Minor
>
> We hit a problem when trying to export a DataFrame to an external MySQL 
> database through the JDBC driver (when the table doesn't exist). In general, 
> Spark will create a new table automatically if it doesn't exist. However, it 
> doesn't support adding additional options when creating that table. 
> For example, we need to set the default "CHARSET=utf-8" in some customers' 
> tables. Otherwise, some UTF-8 columns cannot be exported to MySQL 
> successfully; an encoding exception is thrown and it ultimately breaks the 
> job.  



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
