[
https://issues.apache.org/jira/browse/SPARK-1649?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14633000#comment-14633000
]
Reynold Xin commented on SPARK-1649:
------------------------------------
[~yhuai] / [~liancheng] does this ticket still apply? If not, please close it.
Thanks.
> Figure out Nullability semantics for Array elements and Map values
> ------------------------------------------------------------------
>
> Key: SPARK-1649
> URL: https://issues.apache.org/jira/browse/SPARK-1649
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 1.1.0
> Reporter: Andre Schumacher
> Priority: Critical
>
> For the underlying storage layer, recording in the data type itself whether a
> column can be nullable would simplify tasks such as schema conversion and
> predicate filter determination. The DataType type could then look like this:
> abstract class DataType(nullable: Boolean = true)
> Concrete subclasses could then override the nullable val. Mostly this could
> be left as the default, but for nested types one could optimize by
> distinguishing, e.g., arrays whose elements are nullable from arrays whose
> elements are not.
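A minimal Scala sketch of the quoted proposal — nullability carried on DataType itself, with subclasses overriding the default. The type names here (NonNullInt, ArrayType) are illustrative only; Spark SQL ultimately took a different route, keeping a separate nullable flag on StructField and recording element/value nullability via ArrayType.containsNull and MapType.valueContainsNull rather than on the element type.

```scala
// Sketch of the ticket's proposal: every DataType knows whether
// values of that type may be null, defaulting to nullable.
abstract class DataType { def nullable: Boolean = true }

// Leaf types override the default where the storage layer can
// guarantee non-null values.
case object NonNullInt extends DataType { override def nullable = false }
case object NullableInt extends DataType

// A nested type then expresses element nullability directly through
// the element type it contains, e.g. ArrayType(NonNullInt) is an
// array whose elements are never null.
case class ArrayType(elementType: DataType) extends DataType
case class MapType(keyType: DataType, valueType: DataType) extends DataType
```

Under this scheme a schema converter could check `elementType.nullable` instead of tracking nullability out of band.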
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]