Github user yongjiaw commented on the pull request:

    https://github.com/apache/spark/pull/9321#issuecomment-167131245
  
    @andrewor14 regarding the security concern, yes: LogPage.scala, which 
runs inside the worker process, is hardcoded to read only from the workerDir, 
and one can already tamper with appId and executorId to potentially read the 
logs of other apps. But only the files "stdout" and "stderr" are allowed to be 
read.
    I agree it's best to check which files are being read, but it's not easy 
to access the executor's loggers from the worker process, and one cannot 
assume the worker has the same log4j config as the executors, even though 
that's probably the case most of the time. Do you have any suggestions?
    On the other hand, I don't see serious issues with allowing the user to 
read any file under workerDir. Reading a jar file will display its bytecode, 
but the user already has full control over all the files under workerDir, 
except perhaps other Spark apps' jar files. So a security measure is probably 
better enforced at the application level rather than by file name (even though 
checking the file name is still best practice).
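    To illustrate the kind of check discussed above, here is a minimal sketch 
(the object and method names are hypothetical, not the actual LogPage code) 
that combines the file-name whitelist with a guard against appId/executorId 
tampering, assuming the worker resolves paths as workerDir/appId/executorId/fileName:

    ```scala
    import java.io.File

    object LogAccess {
      // Only the standard executor log files are readable.
      private val allowedLogNames = Set("stdout", "stderr")

      // Hypothetical guard: the requested file name must be whitelisted,
      // and the canonically resolved path must stay under workerDir, which
      // rejects traversal via values like appId = "../other-app".
      def isSafeToRead(workerDir: File, appId: String,
                       executorId: String, fileName: String): Boolean = {
        val requested =
          new File(new File(new File(workerDir, appId), executorId), fileName)
        allowedLogNames.contains(fileName) &&
          requested.getCanonicalPath
            .startsWith(workerDir.getCanonicalPath + File.separator)
      }
    }
    ```

    The canonical-path comparison is what closes the tampering hole the 
whitelist alone misses, since getCanonicalPath resolves ".." segments and 
symlinks before the prefix check.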

