[ https://issues.apache.org/jira/browse/SPARK-21187?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16471519#comment-16471519 ]
Eric Wohlstadter commented on SPARK-21187:
------------------------------------------

[~bryanc] [~hyukjin.kwon] Hi Bryan,

I'm interested in the missing implementation of the Map and Interval types in {{org.apache.spark.sql.vectorized.ArrowColumnVector}}. Is this something that is planned to be implemented? Is there anything particularly hard about these types, or was this just not needed for a particular use case? I was thinking of taking a stab at it, but I thought there might be a pitfall waiting here that you could steer me away from.

My use case is plugging an {{org.apache.arrow.vector.ipc.ArrowStreamReader}} into {{org.apache.spark.sql.sources.v2.reader.DataSourceReader}}.

> Complete support for remaining Spark data types in Arrow Converters
> -------------------------------------------------------------------
>
>                 Key: SPARK-21187
>                 URL: https://issues.apache.org/jira/browse/SPARK-21187
>             Project: Spark
>          Issue Type: Umbrella
>          Components: PySpark, SQL
>    Affects Versions: 2.3.0
>            Reporter: Bryan Cutler
>            Assignee: Bryan Cutler
>            Priority: Major
>
> This is to track adding the remaining type support in Arrow Converters.
> Currently, only primitive data types are supported.
>
> Remaining types:
> * -*Date*-
> * -*Timestamp*-
> * *Complex*: Struct, -Array-, Arrays of Date/Timestamps, Map
> * -*Decimal*-
> * *Binary* - in pyspark
>
> Some things to do before closing this out:
> * -Look to upgrading to Arrow 0.7 for better Decimal support (can now write values as BigDecimal)-
> * -Need to add some user docs-
> * -Make sure Python tests are thorough-
> * Check into complex type support mentioned in comments by [~leif]; should we support multi-indexing?

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)