[
https://issues.apache.org/jira/browse/SOLR-14020?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16992821#comment-16992821
]
Chris M. Hostetter commented on SOLR-14020:
-------------------------------------------
Do we know why there are still a lot of tests using MiniDFSCluster that fail on
jenkins (and reproduce locally for me on linux) with security exceptions
related to {{org.apache.hadoop.util.Shell}} ... even though
SolrSecurityManager seems to explicitly allow all of these code paths?
The stack traces for these AccessControlExceptions don't even seem to show
SolrSecurityManager anywhere in the call stack?
{code:java}
/**
 * {@inheritDoc}
 * <p>This method implements hacks to workaround hadoop's garbage Shell and FileUtil code
 */
@Override
public void checkExec(String cmd) {
  // NOTE: it would be tempting to just allow anything from hadoop's Shell class, but then
  // that would just give an easy vector for RCE (use hadoop Shell instead of e.g. ProcessBuilder)
  // so we whitelist actual caller impl methods instead.
  for (StackTraceElement element : Thread.currentThread().getStackTrace()) {
    // hadoop insists on shelling out to get the user's supplementary groups?
    if ("org.apache.hadoop.security.ShellBasedUnixGroupsMapping".equals(element.getClassName())
        && "getGroups".equals(element.getMethodName())) {
      return;
    }
    // hadoop insists on shelling out to parse 'df' command instead of using FileStore?
    if ("org.apache.hadoop.fs.DF".equals(element.getClassName())
        && "getFilesystem".equals(element.getMethodName())) {
      return;
    }
    ...
{code}
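For reference, the stack-walk whitelisting technique in the snippet above can be sketched in isolation. This is a minimal, hedged sketch, not Solr code: the class name {{ExecWhitelist}} and method {{callerOnStack}} are hypothetical stand-ins; it only shows the core idea of allowing an operation when a known caller class/method appears on the current thread's stack.

```java
// Hypothetical sketch of the stack-walk whitelisting idea used in
// SolrSecurityManager.checkExec above: scan the current thread's stack
// for a specific, pre-approved caller class#method pair.
public class ExecWhitelist {

    /** Returns true if the given class#method appears anywhere on the caller's stack. */
    public static boolean callerOnStack(String className, String methodName) {
        for (StackTraceElement element : Thread.currentThread().getStackTrace()) {
            if (className.equals(element.getClassName())
                    && methodName.equals(element.getMethodName())) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // callerOnStack itself is on the stack while it runs, so this prints true:
        System.out.println(callerOnStack("ExecWhitelist", "callerOnStack"));
        // an unrelated caller that is not on this stack yields false:
        System.out.println(callerOnStack("org.apache.hadoop.fs.DF", "getFilesystem"));
    }
}
```

Whitelisting the concrete caller methods (rather than anything in hadoop's {{Shell}} class) is what closes the easy RCE vector the comment above mentions, at the cost of walking the stack on every exec check.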
[http://fucit.org/solr-jenkins-reports/job-data/apache/Lucene-Solr-Tests-master/3939]
{noformat}
[junit4] 2> NOTE: reproduce with: ant test -Dtestcase=TestSolrCloudWithSecureImpersonation -Dtests.seed=3B693D41330FAB89 -Dtests.multiplier=2 -Dtests.slow=true -Dtests.locale=os-GE -Dtests.timezone=America/Antigua -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
[junit4] ERROR 0.00s J0 | TestSolrCloudWithSecureImpersonation (suite) <<<
[junit4] > Throwable #1: com.google.common.util.concurrent.UncheckedExecutionException: java.security.AccessControlException: access denied ("java.io.FilePermission" "<<ALL FILES>>" "execute")
...
[junit4] > Caused by: java.security.AccessControlException: access denied ("java.io.FilePermission" "<<ALL FILES>>" "execute")
[junit4] > at java.base/java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
[junit4] > at java.base/java.security.AccessController.checkPermission(AccessController.java:897)
[junit4] > at java.base/java.lang.SecurityManager.checkPermission(SecurityManager.java:322)
[junit4] > at java.base/java.lang.SecurityManager.checkExec(SecurityManager.java:572)
[junit4] > at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1096)
[junit4] > at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1071)
[junit4] > at org.apache.hadoop.util.Shell.runCommand(Shell.java:938)
[junit4] > at org.apache.hadoop.util.Shell.run(Shell.java:901)
[junit4] > at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1213)
[junit4] > at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:200)
...
[junit4] 2> NOTE: reproduce with: ant test -Dtestcase=CheckHdfsIndexTest -Dtests.seed=3B693D41330FAB89 -Dtests.multiplier=2 -Dtests.slow=true -Dtests.locale=en-SI -Dtests.timezone=America/Panama -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
[junit4] ERROR 0.00s J1 | CheckHdfsIndexTest (suite) <<<
[junit4] > Throwable #1: java.security.AccessControlException: access denied ("java.io.FilePermission" "<<ALL FILES>>" "execute")
[junit4] > at __randomizedtesting.SeedInfo.seed([3B693D41330FAB89]:0)
[junit4] > at java.base/java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
[junit4] > at java.base/java.security.AccessController.checkPermission(AccessController.java:897)
[junit4] > at java.base/java.lang.SecurityManager.checkPermission(SecurityManager.java:322)
[junit4] > at java.base/java.lang.SecurityManager.checkExec(SecurityManager.java:572)
[junit4] > at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1096)
[junit4] > at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1071)
[junit4] > at org.apache.hadoop.util.Shell.runCommand(Shell.java:938)
[junit4] > at org.apache.hadoop.util.Shell.run(Shell.java:901)
[junit4] > at org.apache.hadoop.fs.DF.getFilesystem(DF.java:74)
...
{noformat}
[http://fucit.org/solr-jenkins-reports/job-data/thetaphi/Lucene-Solr-master-Linux/25129]
{noformat}
[junit4] 2> NOTE: reproduce with: ant test -Dtestcase=TestRecoveryHdfs -Dtests.seed=41F301825C91CC61 -Dtests.multiplier=3 -Dtests.slow=true -Dtests.locale=rw -Dtests.timezone=Asia/Bangkok -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
[junit4] ERROR 0.00s J1 | TestRecoveryHdfs (suite) <<<
[junit4] > Throwable #1: java.security.AccessControlException: access denied ("java.io.FilePermission" "<<ALL FILES>>" "execute")
[junit4] > at __randomizedtesting.SeedInfo.seed([41F301825C91CC61]:0)
[junit4] > at java.base/java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
[junit4] > at java.base/java.security.AccessController.checkPermission(AccessController.java:897)
[junit4] > at java.base/java.lang.SecurityManager.checkPermission(SecurityManager.java:322)
[junit4] > at java.base/java.lang.SecurityManager.checkExec(SecurityManager.java:572)
[junit4] > at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1096)
[junit4] > at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1071)
[junit4] > at org.apache.hadoop.util.Shell.runCommand(Shell.java:938)
[junit4] > at org.apache.hadoop.util.Shell.run(Shell.java:901)
[junit4] > at org.apache.hadoop.fs.DF.getFilesystem(DF.java:74)
[junit4] > at org.apache.hadoop.hdfs.server.namenode.NameNodeResourceChecker$CheckedVolume.<init>(NameNodeResourceChecker.java:70)
[junit4] > at org.apache.hadoop.hdfs.server.namenode.NameNodeResourceChecker.addDirToCheck(NameNodeResourceChecker.java:166)
[junit4] > at org.apache.hadoop.hdfs.server.namenode.NameNodeResourceChecker.<init>(NameNodeResourceChecker.java:135)
[junit4] > at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startCommonServices(FSNamesystem.java:1168)
[junit4] > at org.apache.hadoop.hdfs.server.namenode.NameNode.startCommonServices(NameNode.java:791)
[junit4] > at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:717)
[junit4] > at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:940)
[junit4] > at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:913)
[junit4] > at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1646)
[junit4] > at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1314)
[junit4] > at org.apache.hadoop.hdfs.MiniDFSCluster.configureNameService(MiniDFSCluster.java:1083)
[junit4] > at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:958)
[junit4] > at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:890)
[junit4] > at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:518)
[junit4] > at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:477)
[junit4] > at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:134)
[junit4] > at org.apache.solr.cloud.hdfs.HdfsTestUtil.setupClass(HdfsTestUtil.java:66)
[junit4] > at org.apache.solr.search.TestRecoveryHdfs.beforeClass(TestRecoveryHdfs.java:78)
...
{noformat}
> move hadoop hacks out of lucene TestSecurityManager into a solr one
> -------------------------------------------------------------------
>
> Key: SOLR-14020
> URL: https://issues.apache.org/jira/browse/SOLR-14020
> Project: Solr
> Issue Type: Improvement
> Security Level: Public(Default Security Level. Issues are Public)
> Reporter: Robert Muir
> Priority: Major
> Attachments: SOLR-14020.patch
>
>
> The hadoop hacks here take a heavy toll: because we have to override some
> methods like checkRead, it inserts non-JDK stack frames and breaks a lot of
> JDK doPriv calls. So for example, it means permissions must be added to stuff
> like /dev/random or windows fonts or all kinds of other nonsense.
> This is a price only solr should pay, not lucene. Let's split the stuff out.