Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/6994#issuecomment-119113167
Oh, hm, in this change we updated to 3.4.1:
https://github.com/apache/spark/pull/5380/files
... but the Hadoop profiles override this back to 3.1.1. I think this was
just never updated from the original change at
https://github.com/apache/spark/commit/d8176b1c2f22247ee724041aefa1af9118cf861d
Oops. I can follow up on SPARK-6731 to actually update Commons Math. I suppose
that means we haven't been hitting any compatibility problems, given that Spark
has evidently been working fine against 3.1.1. Now that we need 3.4.1 for real,
though, we will actually be testing against it.
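
For context, this is the kind of Maven layout where a profile can silently pin a dependency back to an older version. This is a hedged sketch, not Spark's actual pom.xml; the profile id and surrounding structure are illustrative only, though `org.apache.commons:commons-math3` is the real artifact in question:

```xml
<!-- Top-level dependencyManagement declares the intended version -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-math3</artifactId>
      <version>3.4.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- A profile (hypothetical id here) can re-declare the same artifact;
     when the profile is active, its version overrides the default above,
     so builds quietly resolve 3.1.1 instead of 3.4.1 -->
<profiles>
  <profile>
    <id>hadoop-provided-example</id>
    <dependencyManagement>
      <dependencies>
        <dependency>
          <groupId>org.apache.commons</groupId>
          <artifactId>commons-math3</artifactId>
          <version>3.1.1</version>
        </dependency>
      </dependencies>
    </dependencyManagement>
  </profile>
</profiles>
```

One way to catch this kind of drift is to inspect the resolved version with the profile active, e.g. `mvn help:effective-pom -P <profile>` or `mvn dependency:tree`, and check which commons-math3 version actually wins.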