Michał Świtakowski created SPARK-33813:
------------------------------------------
Summary: JDBC datasource fails when reading spatial datatypes with
the MS SQL driver
Key: SPARK-33813
URL: https://issues.apache.org/jira/browse/SPARK-33813
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 3.0.0, 3.1.0
Reporter: Michał Świtakowski
The MS SQL JDBC driver has supported spatial types since version 7.0.
The JDBC data source lacks mappings for these types, which results in the
exception below. Adding a mapping in MsSqlServerDialect.getCatalystType
that maps the -157 and -158 type codes to VARBINARY should address the issue.
{noformat}
java.sql.SQLException: Unrecognized SQL type -157
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.getCatalystType(JdbcUtils.scala:251)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.$anonfun$getSchema$1(JdbcUtils.scala:321)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.getSchema(JdbcUtils.scala:321)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:63)
	at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:226)
	at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:35)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:364)
	at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:366)
	at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:355)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:355)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:240)
	at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:381){noformat}
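A minimal standalone sketch of the proposed mapping, assuming the fix lives in MsSqlServerDialect.getCatalystType. The real override would return Option[DataType] built from Spark's catalyst types (BinaryType for VARBINARY); here a plain string stands in for the catalyst type so the sketch runs without Spark on the classpath, and the object name SpatialTypeMapping is hypothetical.

```scala
// Hypothetical sketch of the mapping suggested above. In Spark itself this
// logic would go into MsSqlServerDialect.getCatalystType and return
// Some(BinaryType); a string stands in here to keep the sketch self-contained.
object SpatialTypeMapping {
  // Type codes the MS SQL JDBC driver (>= 7.0) reports for spatial columns.
  val GeometryTypeCode: Int = -157
  val GeographyTypeCode: Int = -158

  def getCatalystType(sqlType: Int): Option[String] = sqlType match {
    // Map both spatial codes to VARBINARY, i.e. Spark's BinaryType.
    case GeometryTypeCode | GeographyTypeCode => Some("BinaryType")
    // Anything else falls through to the generic JDBC type mapping.
    case _ => None
  }
}
```

With this in place, JdbcUtils.getCatalystType would find a dialect-provided type for codes -157 and -158 instead of throwing "Unrecognized SQL type", and the spatial column would surface to Spark as raw binary.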
--
This message was sent by Atlassian Jira
(v8.3.4#803005)