Gianluca Salvo created SPARK-11173:
--------------------------------------
Summary: Cannot save data via MSSQL JDBC
Key: SPARK-11173
URL: https://issues.apache.org/jira/browse/SPARK-11173
Project: Spark
Issue Type: Bug
Components: Java API, PySpark, SQL
Affects Versions: 1.5.1
Environment: Windows 7 sp1 x64, java version "1.8.0_60", Spark 1.5.1,
hadoop 2.6, microsoft jdbc 4.2, pyspark
Reporter: Gianluca Salvo
Hello,
I'm experiencing an issue in writing a DataFrame via JDBC. My code is:
{code:title=Example.python|borderStyle=solid}
from pyspark import SparkContext
from pyspark.sql import SQLContext
import sys
sc=SparkContext(appName="SQL Query")
sqlctx=SQLContext(sc)
serverName="SQLIPAddress"
serverPort="SQL Port"
serverUsername="username"
serverPassword="password"
serverDatabase="database"
#########################################################################
connString="jdbc:sqlserver://{SERVER}:{PORT};user={USER};password={PASSWORD};databasename={DATABASENAME}"
connString=connString.format(SERVER=serverName,PORT=serverPort,USER=serverUsername,PASSWORD=serverPassword,DATABASENAME=serverDatabase)
df = sqlctx.read.format("jdbc").options(url=connString, dbtable="(select * from TestTable) as test_Table").load()
df.show()
try:
    df.write.jdbc(connString, "Test_Target", "append")
    print("saving completed")
except:
    print("Error in saving data", sys.exc_info()[0])
sc.stop()
{code}
Even though I specify *append*, the code throws an exception saying it is trying to
create the table *Test_Target*, but the table already exists.
If I point the script at MariaDB instead, everything works fine:
{code:title=New Connection string|borderStyle=solid}
connString="jdbc:mysql://{SERVER}:{PORT}/{DATABASENAME}?user={USER}&password={PASSWORD}"
{code}
The problem seems to be specific to the Microsoft JDBC driver. Can you suggest or
implement some workaround?
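A possible explanation (an assumption on my part, not confirmed above): before writing in append mode, Spark probes whether the target table exists with a query of the form `SELECT 1 FROM <table> LIMIT 1`. `LIMIT` is not valid T-SQL, so the probe fails against SQL Server, Spark concludes the table is missing, and it falls back to CREATE TABLE. A minimal sketch of a dialect-aware probe (the function name and dialect strings are hypothetical, for illustration only):

```python
def table_exists_query(table, dialect):
    """Build a cheap existence-probe query for the given JDBC dialect.

    Hypothetical helper: if the writer always used the LIMIT form,
    the probe would fail on SQL Server (no LIMIT in T-SQL) and the
    writer would attempt CREATE TABLE even in append mode.
    """
    if dialect == "sqlserver":
        # T-SQL has no LIMIT clause; TOP 1 is the equivalent
        return "SELECT TOP 1 1 FROM {t}".format(t=table)
    # MySQL/MariaDB and most other dialects accept LIMIT
    return "SELECT 1 FROM {t} LIMIT 1".format(t=table)
```

This would also explain why the same script works unchanged against MariaDB, where the LIMIT probe is legal.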
Best regards
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)