Github user viirya commented on the issue:

    https://github.com/apache/spark/pull/21211
  
    @HyukjinKwon Thanks! I forgot to change the doctests. I will update and address
    your comments once I get access to my laptop.
    
    
    On Wed, May 2, 2018, 1:41 PM Hyukjin Kwon <[email protected]> wrote:
    
    > *@HyukjinKwon* commented on this pull request.
    >
    > Seems fine otherwise but let me leave it to @jkbradley
    > <https://github.com/jkbradley>.
    > ------------------------------
    >
    > In python/pyspark/util.py
    > <https://github.com/apache/spark/pull/21211#discussion_r185397304>:
    >
    > >
    > -    """
    > -    m = re.search('^(\d+)\.(\d+)(\..*)?$', version)
    > -    if m is None:
    > -        return None
    > -    else:
    > -        return (int(m.group(1)), int(m.group(2)))
    > +        """
    > +        m = re.search('^(\d+)\.(\d+)(\..*)?$', version)
    > +        if m is None:
    > +            raise ValueError("invalid version string: " + version)
    >
    > Shall we match the message with Scala side?
    >
    >         throw new IllegalArgumentException(s"Spark tried to parse '$sparkVersion' as a Spark" +
    >           s" version string, but it could not find the major and minor version numbers.")
    >
    > ------------------------------
    >
    > In python/pyspark/util.py
    > <https://github.com/apache/spark/pull/21211#discussion_r185397331>:
    >
    > >      """
    > -    Get major and minor version numbers for given Spark version string.
    > -
    > -    >>> version = "2.4.0"
    > -    >>> majorMinorVersion(version)
    > -    (2, 4)
    > +    Provides utility method to determine Spark versions with given input string.
    > +    """
    > +    @staticmethod
    > +    def majorMinorVersion(version):
    >
    > version -> sparkVersion.
    > ------------------------------
    >
    > In python/pyspark/util.py
    > <https://github.com/apache/spark/pull/21211#discussion_r185397366>:
    >
    > >      """
    > -    Get major and minor version numbers for given Spark version string.
    > -
    > -    >>> version = "2.4.0"
    > -    >>> majorMinorVersion(version)
    > -    (2, 4)
    > +    Provides utility method to determine Spark versions with given input string.
    >
    >    * Given a Spark version string, return the (major version number, minor version number).
    >    * E.g., for 2.0.1-SNAPSHOT, return (2, 0).
    >
    > ------------------------------
    >
    > In python/pyspark/util.py
    > <https://github.com/apache/spark/pull/21211#discussion_r185397426>:
    >
    > >
    > -    """
    > -    m = re.search('^(\d+)\.(\d+)(\..*)?$', version)
    > -    if m is None:
    > -        return None
    > -    else:
    > -        return (int(m.group(1)), int(m.group(2)))
    > +        """
    > +        m = re.search('^(\d+)\.(\d+)(\..*)?$', version)
    > +        if m is None:
    >
    > I'd do if m is not None to match the order with Scala side.
    >
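For reference, a minimal sketch of what `majorMinorVersion` might look like with the review comments applied: the parameter renamed to `sparkVersion`, the `if m is not None` branch ordered to match the Scala side, and the error message mirroring the Scala `IllegalArgumentException` text. The enclosing class name `VersionUtils` is an assumption, as the quoted diff does not show it.

```python
import re


class VersionUtils(object):
    """
    Provides utility method to determine Spark versions with given input string.
    """

    @staticmethod
    def majorMinorVersion(sparkVersion):
        """
        Given a Spark version string, return the (major version number, minor version number).
        E.g., for 2.0.1-SNAPSHOT, return (2, 0).

        >>> sparkVersion = "2.4.0"
        >>> VersionUtils.majorMinorVersion(sparkVersion)
        (2, 4)
        >>> sparkVersion = "2.3.0-SNAPSHOT"
        >>> VersionUtils.majorMinorVersion(sparkVersion)
        (2, 3)
        """
        # Raw string avoids relying on Python passing '\d' through unescaped.
        m = re.search(r'^(\d+)\.(\d+)(\..*)?$', sparkVersion)
        if m is not None:
            return (int(m.group(1)), int(m.group(2)))
        else:
            # Message wording mirrors the Scala-side IllegalArgumentException.
            raise ValueError("Spark tried to parse '%s' as a Spark version string, "
                             "but it could not find the major and minor version "
                             "numbers." % sparkVersion)
```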


