GitHub user xinxin-stripe commented on the issue:
https://github.com/apache/spark/pull/21320
Hello, we've been using your patch at Stripe, and we've found what looks like a
new correctness issue:
```
import spark.implicits._

case class Inner(a: String)
case class Outer(key: String, inner: Inner)

val obj = Outer("key", Inner("a"))
val ds = spark.createDataset(Seq(obj))
  .groupByKey(_.key)
  .reduceGroups((struct1, struct2) => struct1)
  .map(_._2)

ds.collect.head shouldBe obj
// This fails with:
//   java.lang.RuntimeException: Couldn't find inner#38 in [key#37,value#41,a#61]
```
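
In case it helps with triage, here is a rough self-contained version of the
same repro as a standalone application. The object name, the local master
setting, and the plain assert in place of ScalaTest's shouldBe are just
placeholders:

```
// Sketch only: a standalone variant of the snippet above, assuming a local
// SparkSession; names like ReduceGroupsRepro are placeholders.
import org.apache.spark.sql.SparkSession

case class Inner(a: String)
case class Outer(key: String, inner: Inner)

object ReduceGroupsRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("reduceGroups-nested-struct-repro")
      .getOrCreate()
    import spark.implicits._

    val obj = Outer("key", Inner("a"))
    val ds = spark.createDataset(Seq(obj))
      .groupByKey(_.key)                           // group on the top-level key
      .reduceGroups((struct1, struct2) => struct1) // keep the first row per group
      .map(_._2)                                   // drop the key, keep the value

    // Expected: the single row round-trips unchanged.
    // Observed with the patch: analysis fails with
    //   java.lang.RuntimeException: Couldn't find inner#38 in [key#37,value#41,a#61]
    assert(ds.collect().head == obj)

    spark.stop()
  }
}
```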