In this case I'd probably just store it as a String. Our casting rules (which come from Hive) are such that when you use a string as a number or a boolean, it will be cast to the desired type.
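For what it's worth, a minimal sketch of that workaround (the `Prop` case class, view name, and sample data are made up for illustration, and this uses the modern SparkSession API; on 1.3 the equivalent would go through SQLContext):

```scala
// Store every value as a String in the schema and let the Hive-style cast
// rules coerce it at query time.
import org.apache.spark.sql.SparkSession

// Instead of `value: Any` (which schemaFor cannot handle), keep a String.
case class Prop(name: String, value: String)

object AnyAsString {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("any-as-string")
      .getOrCreate()
    import spark.implicits._

    val props = Seq(
      Prop("age", "42"),      // logically an Int
      Prop("active", "true"), // logically a Boolean
      Prop("city", "Paris")   // a plain String
    ).toDF()
    props.createOrReplaceTempView("props")

    // The string "42" is cast to a number for the comparison.
    spark.sql("SELECT name FROM props WHERE value = 42").show()

    // An explicit cast back to the logical type also works.
    spark.sql("SELECT name, CAST(value AS boolean) FROM props WHERE name = 'active'").show()

    spark.stop()
  }
}
```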
Thanks for the PR btw :)

On Fri, Mar 27, 2015 at 2:31 PM, Eran Medan <ehrann.meh...@gmail.com> wrote:

> Hi everyone,
>
> I had a lot of questions today, sorry if I'm spamming the list, but I
> thought it's better than posting all questions in one thread. Let me know
> if I should throttle my posts ;)
>
> Here is my question:
>
> When I try to have a case class that has Any in it (e.g. I have a
> property map and values can be either String, Int or Boolean, and since we
> don't have union types, Any is the closest thing),
> and I try to register such an RDD as a table in 1.2.1 (or convert it to a
> DataFrame in 1.3 and then register that as a table),
> I get this weird exception:
>
> Exception in thread "main" scala.MatchError: Any (of class
> scala.reflect.internal.Types$ClassNoArgsTypeRef) at
> org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:112)
>
> which, from my interpretation, simply means that Any is not a valid type
> that Spark SQL can support in its schema.
>
> I already sent a pull request <https://github.com/apache/spark/pull/5235> to
> solve the cryptic exception, but my question is: *is there a way to
> support an "Any" type in Spark SQL?*
>
> Disclaimer - also posted at
> http://stackoverflow.com/questions/29310405/what-is-the-right-way-to-represent-an-any-type-in-spark-sql