[
https://issues.apache.org/jira/browse/SPARK-1649?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14077150#comment-14077150
]
Robbie Russo commented on SPARK-1649:
-------------------------------------
Just opened https://issues.apache.org/jira/browse/SPARK-2721
> Figure out Nullability semantics for Array elements and Map values
> ------------------------------------------------------------------
>
> Key: SPARK-1649
> URL: https://issues.apache.org/jira/browse/SPARK-1649
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 1.1.0
> Reporter: Andre Schumacher
> Priority: Critical
>
> For the underlying storage layer it would simplify things such as schema
> conversion and predicate filter determination to record in the data type
> itself whether a column can be null. So the DataType type could look
> like this:
> abstract class DataType(nullable: Boolean = true)
> Concrete subclasses could then override the nullable val. Mostly this could
> be left as the default, but when types are contained in nested types one
> could optimize for, e.g., arrays with elements that are nullable and those
> that are not.
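A rough Scala sketch of how a nullability-carrying DataType hierarchy along these lines might look. The type names and constructor shapes below follow the proposal quoted above and are illustrative only; they are not the actual org.apache.spark.sql types.

// Sketch only: nullability recorded on the data type itself, per the proposal above.
abstract class DataType(val nullable: Boolean = true)

// Primitive types carry their own nullability and can override the default.
case class IntegerType(override val nullable: Boolean = true) extends DataType(nullable)
case class StringType(override val nullable: Boolean = true) extends DataType(nullable)

// For nested types, the element/value type's own flag records whether entries
// may be null, which is the per-element optimization mentioned above.
case class ArrayType(elementType: DataType,
                     override val nullable: Boolean = true) extends DataType(nullable)
case class MapType(keyType: DataType,
                   valueType: DataType,
                   override val nullable: Boolean = true) extends DataType(nullable)

// Example: an array column whose elements are guaranteed non-null, letting the
// storage layer skip null bookkeeping for the elements.
val denseIntArray = ArrayType(IntegerType(nullable = false))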
--
This message was sent by Atlassian JIRA
(v6.2#6252)