Hi Aris,
thanks for sharing this issue. I can confirm that value classes
currently don't work with Datasets; however, I can't think of a reason
why they shouldn't be supported. I would therefore recommend that you
report this as a bug.
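
In the meantime, the workaround you already found (dropping "extends
AnyVal") is probably the way to go. A minimal sketch, assuming the same
SparkSession value "spark" as in your snippet:

  // A plain case class instead of a value class; Spark's product
  // encoder handles this without issue.
  case class FeatureId(value: Int)

  import spark.implicits._  // brings the implicit Encoder into scope

  val seq = Seq(FeatureId(1), FeatureId(2), FeatureId(3))
  val ds = spark.createDataset(seq)
  ds.count  // should return 3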

(Btw, value classes also currently aren't definable in the REPL. See
https://issues.apache.org/jira/browse/SPARK-17367)

regards,
--Jakob

On Thu, Sep 1, 2016 at 1:58 PM, Aris <arisofala...@gmail.com> wrote:
> Hello Spark community -
>
> Do Spark 2.0 Datasets *not support* Scala value classes (basically
> "extends AnyVal", with a bunch of limitations)?
>
> I am trying to do something like this:
>
> case class FeatureId(value: Int) extends AnyVal
> val seq = Seq(FeatureId(1), FeatureId(2), FeatureId(3))
> import spark.implicits._
> val ds = spark.createDataset(seq)
> ds.count
>
>
> This compiles, but breaks at runtime with a cryptic error about
> "cannot find int at value". If I remove the "extends AnyVal" part,
> everything works.
>
> Value classes are a great Scala feature for both performance and
> static type checking, but are they prohibited in Spark Datasets?
>
> Thanks!
>
