[ https://issues.apache.org/jira/browse/SPARK-8288?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-8288:
-----------------------------------
Assignee: (was: Apache Spark)
> ScalaReflection should also try apply methods defined in companion objects
> when inferring schema from a Product type
> --------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-8288
> URL: https://issues.apache.org/jira/browse/SPARK-8288
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 1.4.0
> Reporter: Cheng Lian
>
> This ticket is derived from PARQUET-293 (which actually describes a Spark SQL
> issue).
> My comment on that issue is quoted below:
> {quote}
> ... The reason for this exception is that the Scala code Scrooge generates
> is actually a trait extending {{Product}}:
> {code}
> trait Junk
>   extends ThriftStruct
>   with scala.Product2[Long, String]
>   with java.io.Serializable
> {code}
> while Spark expects a case class, something like:
> {code}
> case class Junk(junkID: Long, junkString: String)
> {code}
> The key difference is that the case class version has a primary constructor
> whose parameters can be mapped directly to fields of the DataFrame schema.
> The exception was thrown because Spark can't find such a constructor in
> trait {{Junk}}.
> {quote}
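>
> For context, schema inference already works for the case class form. Below is
> a minimal sketch against the Spark 1.4 API (the app setup, object name, and
> sample data are assumptions for illustration):
> {code}
> import org.apache.spark.{SparkConf, SparkContext}
> import org.apache.spark.sql.SQLContext
>
> // The case class form Spark expects: constructor parameters become schema fields.
> case class Junk(junkID: Long, junkString: String)
>
> object SchemaInferenceDemo {
>   def main(args: Array[String]): Unit = {
>     val sc = new SparkContext(new SparkConf().setAppName("demo").setMaster("local[*]"))
>     val sqlContext = new SQLContext(sc)
>
>     // ScalaReflection maps junkID -> LongType and junkString -> StringType.
>     val df = sqlContext.createDataFrame(Seq(Junk(1L, "junk")))
>     df.printSchema()
>     // root
>     //  |-- junkID: long (nullable = false)
>     //  |-- junkString: string (nullable = true)
>   }
> }
> {code}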
> We can make {{ScalaReflection}} try {{apply}} methods in companion objects,
> so that trait types generated by Scrooge can also be used for Spark SQL
> schema inference.
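>
> For illustration, a hand-written sketch of the kind of trait this change would
> accept. The trait body and the companion {{apply}} signature below are
> assumptions modeled on the Scrooge snippet above, not actual generated code,
> and {{ThriftStruct}} is omitted to keep the sketch self-contained:
> {code}
> trait Junk extends scala.Product2[Long, String] with java.io.Serializable {
>   def junkID: Long
>   def junkString: String
>
>   // Product2 members backed by the two logical fields.
>   def _1: Long = junkID
>   def _2: String = junkString
>   def canEqual(other: Any): Boolean = other.isInstanceOf[Junk]
> }
>
> object Junk {
>   // With the proposed change, ScalaReflection would fall back to this
>   // companion apply method and derive the schema from its parameters:
>   // junkID -> LongType, junkString -> StringType.
>   def apply(junkID: Long, junkString: String): Junk = {
>     val (id, s) = (junkID, junkString)
>     new Junk {
>       val junkID: Long = id
>       val junkString: String = s
>     }
>   }
> }
> {code}
> Today, passing instances of such a trait to {{createDataFrame}} still fails
> because, as described above, Spark can't find a suitable constructor on the
> trait; the improvement is precisely to make {{ScalaReflection}} inspect
> {{Junk.apply}} as a fallback.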