[
https://issues.apache.org/jira/browse/SPARK-11623?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15001648#comment-15001648
]
Nguyen Van Nghia edited comment on SPARK-11623 at 11/12/15 3:34 AM:
--------------------------------------------------------------------
I am using Spark to update an RDB table in Oracle 11g. Checking the newest Spark
source code, I found that Spark still does not support Oracle, because
no Oracle dialect case class definition exists in JdbcDialects.scala.
The issue is still open for my case.
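Until an Oracle dialect ships with Spark, one possible workaround is to register a custom dialect at runtime. The sketch below assumes Spark 1.4's public JdbcDialect/JdbcDialects/JdbcType API; the object name and the specific type mappings are illustrative assumptions, not an exhaustive or official Oracle mapping:

```scala
import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects, JdbcType}
import org.apache.spark.sql.types._

// Hypothetical minimal Oracle dialect: makes schemaString() emit
// Oracle-compatible column types instead of the generic defaults
// (e.g. TEXT, which Oracle rejects).
object OracleDialectSketch extends JdbcDialect {
  // Claim any Oracle JDBC URL.
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:oracle")

  // Map a few Catalyst types to Oracle column types; the mappings
  // below are assumptions for illustration only.
  override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
    case StringType  => Some(JdbcType("VARCHAR2(255)", java.sql.Types.VARCHAR))
    case BooleanType => Some(JdbcType("NUMBER(1)", java.sql.Types.NUMERIC))
    case LongType    => Some(JdbcType("NUMBER(19)", java.sql.Types.NUMERIC))
    case _           => None // fall back to Spark's defaults
  }
}

// Register before calling DataFrameWriter.jdbc(...).
JdbcDialects.registerDialect(OracleDialectSketch)
```

This only addresses the generated column types; it does not by itself fix the table-existence check described in the issue.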
> Sparksql-1.4.1 DataFrameWrite.jdbc() bug
> ----------------------------------------
>
> Key: SPARK-11623
> URL: https://issues.apache.org/jira/browse/SPARK-11623
> Project: Spark
> Issue Type: Bug
> Components: Java API, Spark Submit, SQL
> Affects Versions: 1.4.1, 1.5.1
> Environment: Spark stand alone cluster
> Reporter: Nguyen Van Nghia
>
> I am running spark-submit on Windows 8.1 with a Spark standalone cluster (01
> worker and 01 master); the job throws an exception in the DataFrameWriter.jdbc(..)
> Scala function.
> We found that the following test:
> var tableExists = JdbcUtils.tableExists(conn, table)
> always returns false even if we have already created the table.
> That drives the function to create the table from the specified DataFrame,
> and the CREATE TABLE statement fails with a SQL syntax error. We located the
> SQL execution statement here:
> if (!tableExists) {
>   val schema = JDBCWriteDetails.schemaString(df, url)
>   val sql = s"CREATE TABLE $table ($schema)"
>   conn.prepareStatement(sql).executeUpdate() // this call fails with a SQL syntax error
> }
> This happened with spark-1.4.1 and Spark-1.5.1 (our dev environment).
> Please help!
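A possible explanation for the always-false result (an assumption from reading the 1.4.x source, not confirmed against Oracle here): the existence check probes the table with a LIMIT query and treats any failure as "table does not exist". Oracle 11g does not support the LIMIT clause, so the probe always throws. A self-contained sketch of that probe logic (helper names are hypothetical):

```scala
import java.sql.Connection
import scala.util.Try

// Sketch of the table-existence probe as it appears in Spark 1.4.x's
// JdbcUtils: run "SELECT 1 FROM <table> LIMIT 1" and treat success as
// existence. On Oracle 11g the LIMIT syntax is rejected, the statement
// throws, and the probe reports the table as missing even when it exists.
def probeSql(table: String): String = s"SELECT 1 FROM $table LIMIT 1"

def tableExists(conn: Connection, table: String): Boolean =
  Try(conn.prepareStatement(probeSql(table)).executeQuery().next()).isSuccess
```

If this is the cause, the writer then attempts CREATE TABLE with generic column types that Oracle rejects, producing the syntax error reported above.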
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]