Github user daniel-shields commented on the issue:
https://github.com/apache/spark/pull/21449
@mgaido91 I looked at the test failures and I think the changes to the
`Dataset.resolve` method are causing havoc. Consider the `Dataset.drop` method with
the following signature:
` def drop(col: Column): DataFrame`
There is a statement that may be comparing an AttributeReference with the
new metadata to one without it:
```
// Keep every attribute that does not compare equal to the dropped expression;
// a metadata mismatch here would silently keep the column.
val colsAfterDrop = attrs.filter { attr =>
  attr != expression
}.map(attr => Column(attr))
```
This may be resulting in columns not getting dropped. I haven't verified
this, but it's the first thing I would check. The change to `resolve` may be
too drastic.
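To illustrate the failure mode, here's a minimal spark-shell-style sketch using the Catalyst-internal `AttributeReference` directly (the column name and metadata key are made up for the example): equality on `AttributeReference` includes metadata, so a reference that picks up new metadata during resolution no longer compares equal to the original, and the filter above keeps the column instead of dropping it:
```
import org.apache.spark.sql.catalyst.expressions.AttributeReference
import org.apache.spark.sql.types.{IntegerType, Metadata, MetadataBuilder}

// A plain reference to a column "a".
val plain = AttributeReference("a", IntegerType, nullable = true, Metadata.empty)()

// The same reference (same exprId) with extra metadata attached,
// roughly what a metadata-adding resolve would produce.
val tagged = plain.withMetadata(
  new MetadataBuilder().putString("tag", "example").build())

// AttributeReference.equals compares metadata too, so these differ,
// and a filter like `attr != expression` would keep the column.
println(plain == tagged) // false
```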