Github user nchammas commented on the pull request:

    https://github.com/apache/spark/pull/4901#issuecomment-77396408
  
    I don't think we should duplicate the list of valid Spark versions. 
Instead, it probably makes more sense to have a separate dictionary just for 
the Tachyon version mappings and to put it towards the beginning of the script 
as a global, so it's easy to find and update, since updating it will be done 
manually.
    
    Also, let's please reuse the existing code to validate the Spark version 
and avoid method signatures that just take an `opts` blob as an argument. I 
know we have them all over the place in `spark-ec2`, but we should move away 
from that.
    
    For example:
    
    ```python
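    # Tachyon version to bundle for each supported Spark release.
    # This mapping is updated by hand, so keep it near the top of the script.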
    SPARK_TACHYON_MAP = {
        "1.0.0": "0.4.1",
        "1.0.1": "0.4.1",
        "1.0.2": "0.4.1",
        "1.1.0": "0.5.0",
        "1.1.1": "0.5.0",
        "1.2.0": "0.5.0",
        "1.2.1": "0.5.0",
    }
    
    def get_tachyon_version(spark_version):
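        # Fall back to an empty string when no Tachyon version is mapped
        # to the given Spark release.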
        if spark_version not in SPARK_TACHYON_MAP:
            return ""
        else:
            return SPARK_TACHYON_MAP[spark_version]
    
    spark_v = get_validate_spark_version(opts.spark_version, opts.spark_git_repo)
    
    if "." in opts.spark_version:
        tachyon_v = get_tachyon_version(spark_v)
    else:
       ...
    ```


