Hi,
I am trying to write a new aggregate function
(https://issues.apache.org/jira/browse/SPARK-17691) and I want it to support
all ordered types.
I have run into several issues, though:

1. How do I convert the child expression's type to a standard Scala type
(e.g. I need an Array[Int] for IntegerType and an Array[Double] for
DoubleType)? The only method I have found so far is to write a match case for
each type (a simplified sketch of this is included after the questions). Is
there a better way?

2. What would be the corresponding Scala types for DecimalType,
TimestampType, DateType and BinaryType? I also couldn't figure out how to
write a case for DecimalType. Do I need a specific case for each of its
internal types?

3. Should BinaryType be a legal type for such a function?

4. I need to serialize the relevant typed array (i.e. turn it into an
Array[Byte]) to work with TypedImperativeAggregate. Currently I use
java.io.{ByteArrayOutputStream, ByteArrayInputStream, ObjectInputStream,
ObjectOutputStream} (a sketch of my current code is included below). Is there
a more standard way, e.g. a "serialize" function that knows whether to use
Java serialization, Kryo serialization, etc. based on the Spark
configuration?
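
For question 1, this is roughly what I do today, simplified; buildTypedBuffer
is just an illustrative helper name of mine, not an existing Spark API:

    import org.apache.spark.sql.types._
    import scala.collection.mutable.ArrayBuffer

    // Sketch: pick a concrete Scala buffer based on the child expression's dataType.
    def buildTypedBuffer(dataType: DataType): ArrayBuffer[_] = dataType match {
      case IntegerType => ArrayBuffer.empty[Int]
      case LongType    => ArrayBuffer.empty[Long]
      case FloatType   => ArrayBuffer.empty[Float]
      case DoubleType  => ArrayBuffer.empty[Double]
      // ... one case per supported type, which is what I would like to avoid
      case other => throw new UnsupportedOperationException(s"Unsupported type: $other")
    }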
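
For question 4, this is the kind of code I currently put behind the
serialize/deserialize pair (again a simplified sketch; it assumes the
aggregation buffer is java.io.Serializable):

    import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}

    // Sketch: turn the aggregation buffer into Array[Byte] and back using plain Java serialization.
    def serializeBuffer(buffer: AnyRef): Array[Byte] = {
      val bytes = new ByteArrayOutputStream()
      val out = new ObjectOutputStream(bytes)
      try out.writeObject(buffer) finally out.close()
      bytes.toByteArray
    }

    def deserializeBuffer(storageFormat: Array[Byte]): AnyRef = {
      val in = new ObjectInputStream(new ByteArrayInputStream(storageFormat))
      try in.readObject() finally in.close()
    }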
Thanks,
                Assaf.
