[ https://issues.apache.org/jira/browse/SPARK-27625?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan reassigned SPARK-27625:
-----------------------------------

    Assignee: Marco Gaido

> ScalaReflection.serializerFor fails for annotated types
> -------------------------------------------------------
>
>                 Key: SPARK-27625
>                 URL: https://issues.apache.org/jira/browse/SPARK-27625
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.2, 2.4.2
>            Reporter: Patrick Grandjean
>            Assignee: Marco Gaido
>            Priority: Major
>             Fix For: 3.0.0
>
>
> ScalaReflection.serializerFor fails for annotated types. Example:
> {code:java}
> case class Foo(
>   field1: String,
>   field2: Option[String] @Bar
> )
>
> val rdd: RDD[Foo] = ...
> val ds = rdd.toDS // fails at runtime
> {code}
> The stack trace:
> {code:java}
> User class threw exception: scala.MatchError: scala.Option[String] @Bar (of class scala.reflect.internal.Types$AnnotatedType)
> 	at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor$1.apply(ScalaReflection.scala:483)
> 	at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor$1.apply(ScalaReflection.scala:445)
> 	at scala.reflect.internal.tpe.TypeConstraints$UndoLog.undo(TypeConstraints.scala:56)
> 	at org.apache.spark.sql.catalyst.ScalaReflection$class.cleanUpReflectionObjects(ScalaReflection.scala:824)
> 	at org.apache.spark.sql.catalyst.ScalaReflection$.cleanUpReflectionObjects(ScalaReflection.scala:39)
> 	at org.apache.spark.sql.catalyst.ScalaReflection$.org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor(ScalaReflection.scala:445)
> 	at ...
> {code}
> I believe that it would be safe to ignore the annotation.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
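The `MatchError` arises because the type `Option[String] @Bar` is represented in scala-reflect as an `AnnotatedType` wrapping the underlying `Option[String]`, a case the serializer's pattern match did not cover. The reporter's suggestion, ignoring the annotation, can be sketched with plain scala-reflect: unwrap `AnnotatedType` down to its underlying type before any further matching. This is only an illustrative sketch, not the actual Spark patch; the `Bar` annotation below is a hypothetical stand-in for the one in the report.

{code:java}
import scala.reflect.runtime.universe._

// Hypothetical annotation, standing in for @Bar from the report.
class Bar extends scala.annotation.StaticAnnotation

object StripAnnotations {
  // Recursively unwrap AnnotatedType to the underlying type;
  // non-annotated types are returned unchanged.
  def strip(tpe: Type): Type = tpe match {
    case AnnotatedType(_, underlying) => strip(underlying)
    case other                        => other
  }
}

object Demo extends App {
  val annotated = typeOf[Option[String] @Bar]
  // annotated is an AnnotatedType; stripping yields plain Option[String],
  // which the existing serializer cases already know how to handle.
  println(StripAnnotations.strip(annotated) =:= typeOf[Option[String]])
}
{code}

Dispatching on the stripped type instead of the raw one would let the annotated field fall through to the ordinary `Option[String]` handling rather than hitting the catch-all `MatchError`.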