GitHub user jkbradley opened a pull request:
https://github.com/apache/spark/pull/15017
[SPARK-17456][CORE] Utility for parsing Spark versions
## What changes were proposed in this pull request?
This patch adds methods for extracting major and minor versions as Int
types in Scala from a Spark version string.
Motivation: There are many hacks within Spark's codebase to identify and
compare Spark versions. We should add a simple utility to standardize these
code paths, especially since there have been mistakes made in the past. This
will also let us add unit tests. The immediate use case is checking Spark
versions to provide backwards compatibility for ML model persistence.
## How was this patch tested?
Unit tests
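The utility described above might be sketched roughly as follows. This is an illustrative sketch only, not necessarily the PR's actual code; the object name `VersionUtils` matches the commit message, but the method names (`majorMinorVersion`, `majorVersion`, `minorVersion`) and the regex are assumptions:

```scala
// Hypothetical sketch of a Spark-version-parsing utility.
// Method names and regex are illustrative assumptions.
object VersionUtils {

  // Matches strings like "2.0", "2.0.1", or "2.0.1-SNAPSHOT":
  // group 1 = major, group 2 = minor, optional trailing remainder.
  private val majorMinorRegex = """^(\d+)\.(\d+)(\..*)?$""".r

  /** Extracts (major, minor) as Ints from a version string like "2.0.1". */
  def majorMinorVersion(sparkVersion: String): (Int, Int) = sparkVersion match {
    case majorMinorRegex(major, minor, _) => (major.toInt, minor.toInt)
    case _ =>
      throw new IllegalArgumentException(
        s"Could not parse Spark version string '$sparkVersion': expected a " +
        "major and minor version, e.g. '2.0' or '2.0.1'.")
  }

  /** Major version as an Int, e.g. 2 for "2.0.1". */
  def majorVersion(sparkVersion: String): Int = majorMinorVersion(sparkVersion)._1

  /** Minor version as an Int, e.g. 0 for "2.0.1". */
  def minorVersion(sparkVersion: String): Int = majorMinorVersion(sparkVersion)._2
}
```

For example, `VersionUtils.majorMinorVersion("2.0.1")` would return `(2, 0)`, and a malformed string would raise a descriptive exception rather than silently misparsing.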
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/jkbradley/spark version-parsing
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/15017.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #15017
----
commit 249661f2d7acc8beecaede46cfa46c33e68a01e2
Author: Joseph K. Bradley <[email protected]>
Date: 2016-09-08T20:41:50Z
Added VersionUtils to spark core
commit 8978cb33f35b0b2a544c29eb19e8b740637e83d8
Author: Joseph K. Bradley <[email protected]>
Date: 2016-09-08T20:53:19Z
simplification
----