Github user shivaram commented on the pull request:

    https://github.com/apache/spark/pull/134#issuecomment-40913796
  
    Sorry for the delay in looking at this. The refactoring into a new function 
`destroy_cluster` looks good to me. I am not sure I understand the purpose of 
the unit test, though: if we want to debug AWS consistency issues, those will 
not occur in a unit test, will they? Am I missing something?
    
    Regarding dependencies, we actually ship the boto version we use as part of 
Spark in https://github.com/apache/spark/tree/master/ec2/third_party -- What 
are the licenses of moto and mock? If they are Apache-compatible and not very 
large, we can add them to third_party.
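    To illustrate the kind of test being discussed, here is a minimal sketch of 
how a cluster-teardown helper could be unit tested against a stubbed boto 
connection using only the standard library's `unittest.mock` (no AWS access, 
no moto needed). The `destroy_cluster` body and the instance IDs here are 
purely illustrative assumptions, not the actual spark-ec2 code:

    ```python
    # Hypothetical sketch: stub a boto-style EC2 connection with the
    # standard-library mock so teardown logic can be exercised offline.
    from unittest import mock

    def destroy_cluster(conn, instance_ids):
        # Illustrative helper: terminate the given instances via the
        # connection object and return whatever the API call returns.
        return conn.terminate_instances(instance_ids)

    # Build a fake connection; no real AWS credentials or network needed.
    conn = mock.Mock()
    conn.terminate_instances.return_value = ["i-1234567890abcdef0"]

    result = destroy_cluster(conn, ["i-1234567890abcdef0"])

    # Verify the helper called the API exactly once with the right IDs.
    conn.terminate_instances.assert_called_once_with(["i-1234567890abcdef0"])
    ```

    A moto-based test would look similar but would stand up an in-memory 
EC2 backend instead of a hand-built mock, which is where the licensing 
question above becomes relevant.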

