[ https://issues.apache.org/jira/browse/SPARK-8500?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15393282#comment-15393282 ]

Hyukjin Kwon commented on SPARK-8500:
-------------------------------------

FYI, this is still happening in 2.0.0 and the current master. The main problem 
is that we can't know the element type of the array before actually reading 
and accessing the array (see 
https://docs.oracle.com/javase/7/docs/api/java/sql/Array.html#getBaseType()).

I did a bit of research but could not find a proper way to get the element 
type of the array from {{MetaData}}. This could be done easily if there were a 
way to find the element type, so that we could return a complete array type 
from {{JDBCRDD.getCatalystType}}.
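
To illustrate, here is a minimal sketch against plain JDBC (not Spark), assuming a reachable local Postgres and a hypothetical table {{my_table}} with a {{text[]}} column {{my_col}}: the metadata layer reports only {{java.sql.Types.ARRAY}} plus a driver-specific type name, while {{java.sql.Array#getBaseType()}} only works once a row has actually been fetched.

{code:scala}
// Sketch of the problem, assuming a local Postgres and a hypothetical
// table my_table with a text[] column my_col.
import java.sql.{DriverManager, Types}

val conn = DriverManager.getConnection("jdbc:postgresql://localhost/my_db")
try {
  val rs = conn.createStatement()
    .executeQuery("SELECT my_col FROM my_table LIMIT 1")
  val md = rs.getMetaData

  // ResultSetMetaData only says the column is an ARRAY (java.sql.Types.ARRAY)
  println(md.getColumnType(1) == Types.ARRAY)   // true
  // ... plus a driver-specific type name (e.g. "_text" on Postgres), which
  // would need per-driver parsing to recover the element type.
  println(md.getColumnTypeName(1))

  // java.sql.Array#getBaseType() does expose the element type, but only
  // after a row has actually been read, which is too late to build the
  // schema in JDBCRDD.getCatalystType.
  if (rs.next()) {
    val arr = rs.getArray(1)
    println(arr.getBaseType)       // e.g. java.sql.Types.VARCHAR
    println(arr.getBaseTypeName)   // e.g. "text"
  }
} finally {
  conn.close()
}
{code}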

> Support for array types in JDBCRDD
> ----------------------------------
>
>                 Key: SPARK-8500
>                 URL: https://issues.apache.org/jira/browse/SPARK-8500
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>         Environment: MacOSX 10.10.3, Postgres 9.3.5, Spark 1.4 hadoop 2.6, 
> Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_40)
> spark-shell --driver-class-path ./postgresql-9.3-1103.jdbc41.jar
>            Reporter: michal pisanko
>
> Loading a table with a text[] column via sqlContext causes an error.
> sqlContext.load("jdbc", Map("url" -> "jdbc:postgresql://localhost/my_db", 
> "dbtable" -> "table"))
> Table has a column:
> my_col              | text[]                      |
> Stacktrace: https://gist.github.com/8b163bf5fdc2aea7dbb6.git
> The same occurs in the pyspark shell.
> Loading another table without a text array column works fine.
> Possible hint:
> https://github.com/apache/spark/blob/d986fb9a378416248768828e6e6c7405697f9a5a/sql/core/src/main/scala/org/apache/spark/sql/jdbc/JDBCRDD.scala#L57


