[ https://issues.apache.org/jira/browse/SPARK-10419?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Reynold Xin updated SPARK-10419:
--------------------------------
    Summary: Add JDBC dialect for Microsoft SQL Server  (was: Add SQLServer JdbcDialect support for datetimeoffset types)

> Add JDBC dialect for Microsoft SQL Server
> -----------------------------------------
>
>                 Key: SPARK-10419
>                 URL: https://issues.apache.org/jira/browse/SPARK-10419
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.4.1, 1.5.0
>            Reporter: Ewan Leith
>            Priority: Minor
>
> When reading a Microsoft SQL Server table over JDBC, if the table contains a
> datetimeoffset column, the following error is raised:
> {code}
> sqlContext.read.jdbc("jdbc:sqlserver://127.0.0.1:1433;DatabaseName=testdb", "sampletable", prop)
>
> java.sql.SQLException: Unsupported type -155
>   at org.apache.spark.sql.jdbc.JDBCRDD$.org$apache$spark$sql$jdbc$JDBCRDD$$getCatalystType(JDBCRDD.scala:100)
>   at org.apache.spark.sql.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:137)
>   at org.apache.spark.sql.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:137)
>   at scala.Option.getOrElse(Option.scala:120)
>   at org.apache.spark.sql.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:136)
>   at org.apache.spark.sql.jdbc.JDBCRelation.<init>(JDBCRelation.scala:128)
>   at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:200)
>   at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:130)
> {code}
> Based on the JdbcDialect code for DB2 and the Microsoft SQL Server
> documentation, we should probably treat datetimeoffset types as Strings:
> https://technet.microsoft.com/en-us/library/bb630289%28v=sql.105%29.aspx
> We've created a small addition to JdbcDialects.scala to do this conversion,
> and I'll create a pull request for it.
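A minimal sketch of the kind of dialect the issue describes, modeled on Spark's JdbcDialect API (canHandle/getCatalystType). The object name and the exact matching logic here are assumptions for illustration, not the merged Spark code; it depends on Spark being on the classpath, so it is not standalone.

{code}
import java.sql.Types

import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
import org.apache.spark.sql.types.{DataType, MetadataBuilder, StringType}

// Hypothetical dialect sketch: map SQL Server's datetimeoffset
// (reported as vendor-specific type code -155, which JDBCRDD cannot
// resolve) to Catalyst's StringType, following the DB2 dialect pattern.
case object SqlServerDatetimeOffsetDialect extends JdbcDialect {

  override def canHandle(url: String): Boolean =
    url.startsWith("jdbc:sqlserver")

  override def getCatalystType(
      sqlType: Int,
      typeName: String,
      size: Int,
      md: MetadataBuilder): Option[DataType] = {
    // Returning None falls back to Spark's default JDBC type mapping.
    if (typeName.contains("datetimeoffset")) Some(StringType) else None
  }
}

// Register before reading, so JDBCRDD.resolveTable consults this dialect:
// JdbcDialects.registerDialect(SqlServerDatetimeOffsetDialect)
{code}

With the dialect registered, the sqlContext.read.jdbc call from the stack trace above would surface datetimeoffset columns as plain strings instead of failing with "Unsupported type -155".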
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org