Github user skonto commented on the issue:

    https://github.com/apache/spark/pull/19272
  
    When an executor starts, it asks the CoarseGrainedSchedulerBackend for the Spark 
configuration, which contains the Hadoop credentials: 
https://github.com/apache/spark/blob/7c92351f43ac4b1710e3c80c78f7978dad491ed2/core/src/main/scala/org/apache/spark/scheduler/cluster/CoarseGrainedSchedulerBackend.scala#L235.
    As we have discussed, this is not safe: over RPC, an arbitrary executor could 
register with the scheduler and obtain the tokens. Does this code handle that case, 
and do we authenticate executors? What about encryption, at least at the RPC level? 
The latter is not supported on Mesos (spark.io.encryption.enabled).

