Github user mdagost commented on the pull request:
https://github.com/apache/spark/pull/5244#issuecomment-90066630
Okay. I fixed the IP misspelling and refactored the conditional logic
into a couple of helper functions, which should be more readable now.
@nchammas I wrote this code and submitted the PR before I realized it was a
dupe, so I never tried #4038.
In terms of testing, I've verified that this works with our AWS setup to
`launch` clusters into private VPCs, and it works with `get-master`, `login`,
and `destroy`. I'm not entirely sure what you want me to do with `coverage`.
This is the output that I get:
```
[ mdagostino: ~/spark_mdagost/ec2 ]$ coverage report spark_ec2.py
Name        Stmts   Miss  Cover
-------------------------------
spark_ec2     677    509    25%
```
Do you actually want me to run the `launch`, `get-master`, `login`, and
`destroy` commands and post the coverage reports for each of them?
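If per-command coverage is what's wanted, one option (a sketch only, not run against a real cluster) would be to accumulate coverage data across the invocations with coverage.py's append mode and then produce a single combined report. The cluster name below is a placeholder, and the identity/key-pair options that `spark_ec2.py` normally requires are omitted for brevity:

```shell
# Sketch: accumulate coverage across several spark_ec2.py subcommands.
# Assumes coverage.py is installed; "my-cluster" is a placeholder name,
# and required options (key pair, identity file, etc.) are omitted.
coverage erase                                   # clear any previous data
coverage run -a spark_ec2.py launch my-cluster   # -a appends to existing data
coverage run -a spark_ec2.py get-master my-cluster
coverage run -a spark_ec2.py login my-cluster
coverage run -a spark_ec2.py destroy my-cluster
coverage report spark_ec2.py                     # one report over all runs
```

That would give a single `Cover` percentage reflecting all four commands together rather than four separate reports.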
---