Does this mean there is a possible mismatch between the JDBC driver and Oracle?
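Type code -101 (from the error below) is not defined in java.sql.Types; it is an Oracle-specific code (oracle.jdbc.OracleTypes.TIMESTAMPTZ, i.e. TIMESTAMP WITH TIME ZONE), so this looks less like a driver mismatch than a column type that Spark's JDBC source does not map. One possible workaround, assuming the offending column really is a TIMESTAMP WITH TIME ZONE (the column name TS_COL below is a placeholder), is to cast it inside a subquery passed as dbtable:

val jdbcDF = sqlContext.load("jdbc", Map(
  "url"     -> "jdbc:oracle:thin:USER/p...@host.com:1517:sid",
  // The cast pushes the conversion down to Oracle, so the driver reports a plain
  // TIMESTAMP (java.sql.Types.TIMESTAMP) instead of the vendor code -101.
  "dbtable" -> "(SELECT CAST(TS_COL AS TIMESTAMP) AS TS_COL FROM USER.TABLE) t",
  "driver"  -> "oracle.jdbc.OracleDriver"))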

From: Ted Yu [mailto:yuzhih...@gmail.com]
Sent: Friday, July 17, 2015 2:09 PM
To: Sambit Tripathy (RBEI/EDS1)
Cc: user@spark.apache.org
Subject: Re: What is "java.sql.SQLException: Unsupported type -101"?

Looking at getCatalystType():
   * Maps a JDBC type to a Catalyst type.  This function is called only when
   * the JdbcDialect class corresponding to your database driver returns null.

sqlType was carrying the value -101.

However, I couldn't find -101 in 
http://grepcode.com/file/repository.grepcode.com/java/root/jdk/openjdk/6-b14/java/sql/Types.java
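
If the cast cannot be pushed into the query, another option (a sketch, assuming a Spark version where the JdbcDialect API is available, 1.4+, and assuming -101 is Oracle's TIMESTAMP WITH TIME ZONE code) is to register a dialect whose getCatalystType handles that code before calling load:

import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects}
import org.apache.spark.sql.types._

// Map Oracle's vendor-specific -101 (TIMESTAMP WITH TIME ZONE) to TimestampType;
// returning None lets every other type fall through to the default mapping.
val oracleTzDialect = new JdbcDialect {
  override def canHandle(url: String): Boolean = url.startsWith("jdbc:oracle")
  override def getCatalystType(sqlType: Int, typeName: String, size: Int,
                               md: MetadataBuilder): Option[DataType] =
    if (sqlType == -101) Some(TimestampType) else None
}
JdbcDialects.registerDialect(oracleTzDialect)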

FYI

On Fri, Jul 17, 2015 at 2:01 PM, Sambit Tripathy (RBEI/EDS1) 
<sambit.tripa...@in.bosch.com> wrote:
Hi,

I was trying to load an Oracle table using the JDBC data source

val jdbcDF = sqlContext.load("jdbc", Map(
  "url"     -> "jdbc:oracle:thin:USER/p...@host.com:1517:sid",
  "dbtable" -> "USER.TABLE",
  "driver"  -> "oracle.jdbc.OracleDriver"))

and got the error below

java.sql.SQLException: Unsupported type -101
        at org.apache.spark.sql.jdbc.JDBCRDD$.getCatalystType(JDBCRDD.scala:78)
        at org.apache.spark.sql.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:112)
        at org.apache.spark.sql.jdbc.JDBCRelation.<init>(JDBCRelation.scala:133)
        at org.apache.spark.sql.jdbc.DefaultSource.createRelation(JDBCRelation.scala:121)
        at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:219)
        at org.apache.spark.sql.SQLContext.load(SQLContext.scala:697)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:23)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:28)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:30)
        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:32)
        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
        at $iwC$$iwC$$iwC.<init>(<console>:36)
        at $iwC$$iwC.<init>(<console>:38)
        at $iwC.<init>(<console>:40)
        at <init>(<console>:42)
        at .<init>(<console>:46)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)


Any idea what it could be?


Regards,
Sambit.

