[ https://issues.apache.org/jira/browse/SPARK-12264?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15052612#comment-15052612 ]

Andras Nemeth commented on SPARK-12264:
---------------------------------------

I guess my concrete proposal was a bit hidden in the last sentence of the 
original description:
Let's add a typeTag or scalaTypeTag method to DataType.
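
For concreteness, the shape I have in mind - purely a sketch, nothing like 
this exists in Spark today, and the name and placement are of course up for 
discussion (written here as an extension method only so the sketch compiles 
as user code; in Spark it would live on DataType itself):

    import org.apache.spark.sql.types.DataType
    import scala.reflect.runtime.universe.TypeTag

    object DataTypeTagSyntax {
      implicit class DataTypeOps(val dt: DataType) extends AnyVal {
        // Hypothetical: the method this issue asks for.
        def scalaTypeTag: TypeTag[_] =
          ???  // e.g. the hand-rolled mapping sketched further below
      }
    }

    // Desired call site, given a DataFrame df:
    //   import DataTypeTagSyntax._
    //   val columnTags = df.schema.fields.map(_.dataType.scalaTypeTag)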

It's not that creating such a mapping on the user side is terribly hard - 
although complex types like maps and arrays can, as far as I can tell, be 
composed arbitrarily, so you do have to do a bit of work to get the recursion 
right.
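
To give a feel for that work, here is roughly the shape such a hand-coded 
mapping takes (a trimmed sketch, not our actual code: it covers only a 
handful of atomic types, it assumes arrays come back as Seq and maps as Map 
inside rows, and the TypeCreator dance at the end is the usual trick for 
turning a runtime Type back into a TypeTag):

    import org.apache.spark.sql.types._
    import scala.reflect.api
    import scala.reflect.runtime.universe._

    object DataTypeToTag {
      // Recursively translate a DataType into a reflection Type. Arrays and
      // maps compose, so this has to recurse; every atomic type needs its
      // own case, which is exactly where the fragility lives.
      def toType(dt: DataType): Type = dt match {
        case BooleanType        => typeOf[Boolean]
        case IntegerType        => typeOf[Int]
        case LongType           => typeOf[Long]
        case DoubleType         => typeOf[Double]
        case StringType         => typeOf[String]
        case ArrayType(elem, _) =>
          appliedType(typeOf[Seq[_]].typeConstructor, List(toType(elem)))
        case MapType(k, v, _)   =>
          appliedType(typeOf[Map[_, _]].typeConstructor,
                      List(toType(k), toType(v)))
        case other              =>
          sys.error(s"No mapping for $other")  // breaks when new types appear
      }

      // Wrap a runtime Type into a TypeTag via a TypeCreator.
      def toTypeTag[T](tpe: Type): TypeTag[T] = {
        val mirror = runtimeMirror(getClass.getClassLoader)
        TypeTag(mirror, new api.TypeCreator {
          def apply[U <: api.Universe with Singleton](m: api.Mirror[U]): U#Type =
            tpe.asInstanceOf[U#Type]
        })
      }

      def scalaTypeTag(dt: DataType): TypeTag[_] = toTypeTag(toType(dt))
    }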

It's more that this user-implemented mapping is very fragile (e.g. I can 
definitely see more system types being added in the future) and ends up 
duplicated across multiple clients.

Getting it at runtime from a concrete row is not great, for several reasons:
- It only gives you a ClassTag, not a TypeTag.
- You can easily end up with an overly concrete class (see the little example 
after this list). E.g. maybe in the first row the first element is a 
one-element set, represented by a collection.immutable.HashSet$HashSet1 - but 
that's not going to be a good class for all the elements of that column.
- It's not nice that you have to look at the actual data to understand what it 
is. What's the point of having schemas then?
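
To illustrate the second point in plain Scala (no Spark involved - the class 
names below are just what the 2.x collections library happens to produce):

    val a: Set[Int] = Set(1)
    val b: Set[Int] = Set(1, 2)
    println(a.getClass.getName)  // e.g. scala.collection.immutable.Set$Set1
    println(b.getClass.getName)  // e.g. scala.collection.immutable.Set$Set2
    // A ClassTag derived from the first row's value would describe Set$Set1,
    // which is useless (or wrong) for the two-element set in the next row.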

> Could DataType provide a TypeTag?
> ---------------------------------
>
>                 Key: SPARK-12264
>                 URL: https://issues.apache.org/jira/browse/SPARK-12264
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>            Reporter: Andras Nemeth
>            Priority: Minor
>
> We are writing code that deals with generic DataFrames as inputs and 
> further processes their contents with normal RDD operations (not SQL). We 
> need some mechanism that tells us exactly what Scala types we will find 
> inside a Row of a given DataFrame.
> The schema of the DataFrame contains this information in an abstract sense, 
> but we need to map it to TypeTags, as that's what the rest of the system 
> uses to identify what type of data an RDD contains - quite the natural 
> choice in Scala.
> As far as I can tell, there is no good way to do this today. For now we have 
> a hand-coded mapping, but that feels very fragile as Spark evolves. Is there 
> a better way I'm missing? And if not, could we create one? Adding a typeTag 
> or scalaTypeTag method to DataType, or at least to AtomicType, seems easy 
> enough.


