Github user hellertime commented on the pull request:

    https://github.com/apache/spark/pull/3074#issuecomment-64228513
  
    The slaves will need to pull the image you specify from Docker Hub (or 
you can pre-pull it using the command-line client on each node).
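    Pre-pulling is just a matter of invoking the Docker CLI on every slave; a 
minimal sketch, assuming a hypothetical image name `myorg/spark-mesos`:

```shell
# Run on each slave node so the image is already cached locally
# before Mesos launches any executors (image name is hypothetical).
docker pull myorg/spark-mesos:latest
```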
    
    If your image is in the main Docker Hub registry, then that is what all 
slaves will use.
    
    If you have a base Dockerfile which includes Spark and the appropriate 
libmesos.so, and you add your Python dependencies on top of that, you should 
be good to go.
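    As a sketch of what such an image might look like (the base image, paths, 
versions, and package names below are illustrative assumptions, not taken from 
this PR):

```dockerfile
# Hypothetical base image for Spark-on-Mesos executors.
FROM ubuntu:14.04

# Mesos native library; assumes a package repo that provides "mesos" is
# configured. Spark locates it via MESOS_NATIVE_LIBRARY.
RUN apt-get update && apt-get install -y mesos python-pip
ENV MESOS_NATIVE_LIBRARY /usr/local/lib/libmesos.so

# Spark itself, unpacked at a known path inside the image.
ADD spark-1.2.0-bin-hadoop2.4.tgz /opt/
ENV SPARK_HOME /opt/spark-1.2.0-bin-hadoop2.4

# Layer your Python dependencies on top of the base.
RUN pip install numpy
```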
    
    You can even test by running Spark in standalone/local mode inside the 
image to make sure the paths are OK.
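    One quick way to run that check, assuming the hypothetical image and Spark 
layout sketched above:

```shell
# Launch a throwaway container and run a local-mode example job to
# verify that the Spark and libmesos.so paths inside the image resolve
# (image name and paths are hypothetical).
docker run --rm -it myorg/spark-mesos:latest \
  /opt/spark-1.2.0-bin-hadoop2.4/bin/spark-submit \
  --master local[2] \
  /opt/spark-1.2.0-bin-hadoop2.4/examples/src/main/python/pi.py 10
```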


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
