[ https://issues.apache.org/jira/browse/SPARK-14767?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15689440#comment-15689440 ]
Mikael Ståldal commented on SPARK-14767:
----------------------------------------
I have the same problem in Spark 2.0 when using
[spark-avro|https://github.com/databricks/spark-avro] and trying to convert the
DataFrame it generates to a Dataset of case classes generated with
[Avrohugger|https://github.com/julianpeeters/avrohugger].
At first I thought this was an [issue with
spark-avro|https://github.com/databricks/spark-avro/issues/186], but spark-avro
converts Avro types to Spark SQL Catalyst types, not directly to Scala types,
so the problem appears to be in Spark's own codegen.
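The root cause can be illustrated without Spark at all: a case-class parameter declared as {{Map[String, String]}} desugars to {{scala.collection.immutable.Map}}, while the generated code hands back a {{scala.collection.Map}}, which is a supertype and therefore not accepted by the constructor. A minimal sketch (plain Scala, no Spark involved; the {{Bug}} name is taken from the reproduction below):

```scala
// Plain-Scala sketch of the mismatch behind the codegen error:
// the case-class constructor takes scala.collection.immutable.Map,
// but the caller only has a scala.collection.Map in hand.
object MapMismatch {
  case class Bug(bug: Map[String, String]) // Map here means immutable.Map

  def main(args: Array[String]): Unit = {
    val generic: scala.collection.Map[String, String] =
      scala.collection.Map("name" -> "dummy")

    // Bug(generic)          // does not compile:
    //   found: scala.collection.Map, required: scala.collection.immutable.Map

    val ok = Bug(generic.toMap) // explicit conversion to immutable.Map
    println(ok.bug.getOrElse("name", null)) // prints "dummy"
  }
}
```

The same conversion is what the generated deserializer would need to emit (a `.toMap` call, or an equivalent catalyst-level conversion) before invoking the case-class constructor.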
> Codegen "no constructor found" errors with Maps inside case classes in
> Datasets
> -------------------------------------------------------------------------------
>
> Key: SPARK-14767
> URL: https://issues.apache.org/jira/browse/SPARK-14767
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: Burak Yavuz
> Priority: Critical
>
> When I have a `Map` inside a case class and am trying to use Datasets,
> the simplest operation throws an exception, because the generated code is
> looking for a constructor with `scala.collection.Map` whereas the constructor
> takes `scala.collection.immutable.Map`.
> To reproduce:
> {code}
> case class Bug(bug: Map[String, String])
> val ds = Seq(Bug(Map("name" -> "dummy"))).toDS()
> ds.map { b =>
>   b.bug.getOrElse("name", null)
> }.count()
> {code}
> Stacktrace:
> {code}
> Caused by: java.util.concurrent.ExecutionException: java.lang.Exception:
> failed to compile: org.codehaus.commons.compiler.CompileException: File
> 'generated.java', Line 163, Column 150: No applicable constructor/method
> found for actual parameters "scala.collection.Map"; candidates are:
> "Bug(scala.collection.immutable.Map)"
> {code}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)