Hi all,

After migrating to Spark 2.0.0, one of my programs now throws the following runtime exception:
    java.lang.RuntimeException: conversions.ProtoTCConversion$Timestamp is not
    a valid external type for schema of struct<seconds:bigint,nanos:int>

although Timestamp is defined as follows:

    case class GoogleTimestamp(seconds: Long, nanos: Int)

I create the DataFrame using createDataFrame(rdd, schema), where the schema holds, among other types:

    StructType(List(
      StructField("seconds", LongType, false),
      StructField("nanos", IntegerType, false)
    ))

This code worked fine on 1.6.2 but doesn't on 2.0.0, and I have no idea what is going wrong. Help much appreciated.

*Corentin Kerisit*
VSRE compliant
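For reference, here is a minimal, self-contained sketch of the construction (names and schema simplified; the SparkSession setup and the Row-instead-of-case-class workaround at the end are my guesses, not something I've confirmed is the intended API):

```scala
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types._

object ReproSketch {
  // Simplified stand-in for the protobuf-derived timestamp type
  case class GoogleTimestamp(seconds: Long, nanos: Int)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("timestamp-repro")
      .getOrCreate()

    // Schema with a nested struct matching the case class fields
    val schema = StructType(List(
      StructField("ts", StructType(List(
        StructField("seconds", LongType, false),
        StructField("nanos", IntegerType, false)
      )), false)
    ))

    // This is what worked on 1.6.2: a case class instance as the struct value.
    // On 2.0.0 it throws "not a valid external type for schema of struct<...>":
    // val rdd = spark.sparkContext.parallelize(Seq(Row(GoogleTimestamp(1L, 2))))
    // spark.createDataFrame(rdd, schema).show()

    // Possible workaround (untested assumption on my part): represent nested
    // structs as Rows when going through createDataFrame(rdd, schema).
    val rdd = spark.sparkContext.parallelize(Seq(Row(Row(1L, 2))))
    spark.createDataFrame(rdd, schema).show()

    spark.stop()
  }
}
```

If the nested-Row form is indeed what 2.0.0 expects, I'd appreciate confirmation of whether this is an intentional behavior change.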