Hi All,

I am using a PostgreSQL database. I am using the following JDBC call to access a customer table (customer_id int, event text, country text, content xml) in my database.
    val dataframe1 = sqlContext.load("jdbc", Map(
      "url" -> "jdbc:postgresql://localhost/customerlogs?user=postgres&password=postgres",
      "dbtable" -> "customer"))

When I run the above command in spark-shell I receive the following error:

    java.sql.SQLException: Unsupported type 1111
      at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.org$apache$spark$sql$execution$datasources$jdbc$JDBCRDD$$getCatalystType(JDBCRDD.scala:103)
      at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:140)
      at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$$anonfun$1.apply(JDBCRDD.scala:140)
      at scala.Option.getOrElse(Option.scala:120)
      at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:139)
      at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91)
      at org.apache.spark.sql.execution.datasources.jdbc.DefaultSource.createRelation(DefaultSource.scala:60)
      at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
      at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
      at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1153)
      at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:25)
      at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:30)
      at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:32)
      at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
      at $iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
      at $iwC$$iwC$$iwC.<init>(<console>:38)
      at $iwC$$iwC.<init>(<console>:40)
      at $iwC.<init>(<console>:42)
      at <init>(<console>:44)
      at .<init>(<console>:48)
      at .<clinit>(<console>)
      at .<init>(<console>:7)
      at .<clinit>(<console>)
      at $print(<console>)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:497)
      at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
      at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
      at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
      at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
      at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
      at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
      at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
      at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
      at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
      at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
      at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
      at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
      at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
      at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
      at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
      at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
      at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
      at org.apache.spark.repl.Main$.main(Main.scala:31)
      at org.apache.spark.repl.Main.main(Main.scala)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:497)
      at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
      at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
      at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
      at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
      at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Is the xml column type not supported in Spark yet? Is there any way to fix this issue?

Thanks,
Rajeshwar Gaini.
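P.S. One workaround I am considering (untested so far) is to push the cast down to PostgreSQL by passing a subquery instead of a table name in the "dbtable" option, so that the JDBC driver reports the content column as text rather than the xml type that Spark rejects. A minimal sketch of what I mean (the alias name customer_as_text is just illustrative):

    // Untested sketch: cast the xml column to text inside a PostgreSQL subquery,
    // so the JDBC relation only sees SQL types that Spark can map.
    val query = "(SELECT customer_id, event, country, CAST(content AS text) AS content FROM customer) AS customer_as_text"
    val dataframe1 = sqlContext.load("jdbc", Map(
      "url" -> "jdbc:postgresql://localhost/customerlogs?user=postgres&password=postgres",
      "dbtable" -> query))

If that works, I would then parse the XML from the resulting string column on the Spark side. Would this be the recommended approach, or is there a better way?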