[ 
https://issues.apache.org/jira/browse/SPARK-8500?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15511840#comment-15511840
 ] 

Hyukjin Kwon commented on SPARK-8500:
-------------------------------------

I am leaving a note that the PostgreSQL dialect now supports {{ArrayType}}; see 
[PostgresDialect.scala|https://github.com/apache/spark/blob/a133057ce5817f834babe9f25023092aec3c321d/sql/core/src/main/scala/org/apache/spark/sql/jdbc/PostgresDialect.scala#L47-L65].
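As a sketch of what this enables (assuming a local PostgreSQL database {{my_db}} with a table whose {{my_col}} column is {{text[]}}, and the PostgreSQL JDBC driver on the classpath; the table and column names are illustrative, not from this issue):

{code:scala}
// Hedged sketch: reading a PostgreSQL table that contains a text[] column.
// Assumes an existing SparkSession named `spark` and the Postgres JDBC driver
// on the driver/executor classpath.
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://localhost/my_db")
  .option("dbtable", "my_table")
  .load()

// With the ArrayType mapping in PostgresDialect, the text[] column should
// surface as ArrayType(StringType) rather than failing at schema resolution.
df.printSchema()
{code}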

> Support for array types in JDBCRDD
> ----------------------------------
>
>                 Key: SPARK-8500
>                 URL: https://issues.apache.org/jira/browse/SPARK-8500
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>         Environment: MacOSX 10.10.3, Postgres 9.3.5, Spark 1.4 hadoop 2.6, 
> Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_40)
> spark-shell --driver-class-path ./postgresql-9.3-1103.jdbc41.jar
>            Reporter: michal pisanko
>
> Loading a table with a {{text[]}} column via {{sqlContext}} causes an error:
> sqlContext.load("jdbc", Map("url" -> "jdbc:postgresql://localhost/my_db", 
> "dbtable" -> "table"))
> Table has a column:
> my_col              | text[]                      |
> Stacktrace: https://gist.github.com/8b163bf5fdc2aea7dbb6.git
> Same occurs in pyspark shell.
> Loading another table without a text array column works all right.
> Possible hint:
> https://github.com/apache/spark/blob/d986fb9a378416248768828e6e6c7405697f9a5a/sql/core/src/main/scala/org/apache/spark/sql/jdbc/JDBCRDD.scala#L57



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
