[ https://issues.apache.org/jira/browse/SPARK-8386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14947775#comment-14947775 ]
Huaxin Gao commented on SPARK-8386:
-----------------------------------
Actually, I can also recreate the problem in the other two cases. The reason I
didn't recreate it earlier is that I had already fixed the tableExists method in
my code. tableExists checks whether a table exists by running SELECT 1 FROM
$table LIMIT 1. This does not work for all databases: for a database that
doesn't support LIMIT 1, the check returns false, so jdbc/insertIntoJDBC tries
to create the table again and fails with a "table already exists" error. I
think this is what Visha got.
I searched Jira, and there is already an issue open for the LIMIT 1 in
tableExists, so I will not fix that problem here. I will only fix the SaveMode
problem described in my first comment.
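The failure mode described above can be sketched in plain JDBC. This is a minimal illustration, not Spark's actual implementation: the helper names `existsQuery` and `tableExists` are assumptions. The `WHERE 1=0` probe shown here is a common portable alternative to `LIMIT 1`, since it returns no rows on every database yet still throws if the table is missing, without relying on dialect-specific row-limiting syntax.

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class TableExistsSketch {
    // Builds a portable existence probe: WHERE 1=0 returns no rows but
    // still raises a SQLException if the table is missing, and it avoids
    // LIMIT, which not every database dialect supports.
    static String existsQuery(String table) {
        return "SELECT * FROM " + table + " WHERE 1=0";
    }

    // Runs the probe; any SQLException is treated as "table does not exist".
    // Note this conflates other failures (permissions, dropped connection)
    // with a missing table -- the same trade-off the LIMIT 1 probe makes.
    static boolean tableExists(Connection conn, String table) {
        try (Statement stmt = conn.createStatement()) {
            stmt.executeQuery(existsQuery(table));
            return true;
        } catch (SQLException e) {
            return false;
        }
    }
}
```

When the probe query uses syntax the target database rejects, tableExists returns false even though the table is there, which is exactly why the subsequent CREATE TABLE fails with "table already exists".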
> DataFrame and JDBC regression
> -----------------------------
>
> Key: SPARK-8386
> URL: https://issues.apache.org/jira/browse/SPARK-8386
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.4.0
> Environment: RHEL 7.1
> Reporter: Peter Haumer
> Priority: Critical
>
> I have an ETL app that appends to a JDBC table new results found at each run.
> In 1.3.1 I did this:
> testResultsDF.insertIntoJDBC(CONNECTION_URL, TABLE_NAME, false);
> When I do this now in 1.4 it complains that the "object" 'TABLE_NAME' already
> exists. I get this even if I switch the overwrite to true. I also tried this
> now:
> testResultsDF.write().mode(SaveMode.Append).jdbc(CONNECTION_URL, TABLE_NAME,
> connectionProperties);
> getting the same error. It works the first time, creating the new table and
> adding data successfully. But running it a second time, the JDBC driver
> tells me that the table already exists. Even SaveMode.Overwrite gives me the
> same error.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)