Hi Abhay,
I have extended Kent Yao's work on supporting Ranger with the SparkSQL
ThriftServer. I managed to integrate with the Ranger master branch, and the
Ranger authorizer code is invoked, but access is always denied by the Ranger
authorizer.
Earlier in the logs, I can see the policies being downloaded and the local
.json cache file being created. However, when the privilege check is invoked,
the engine does not seem to find any matching policies (the logs below show
evaluatorCount=0). I am not sure whether custom class loaders are interfering.
Is it possible to dump the policies at the point where the authorization code
is called?
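In the meantime, to rule out an empty or mismatched cache, I am inspecting the local policy cache file directly. Below is a minimal sketch of the check; the embedded JSON is an illustrative stand-in (the service name "hivedev" and the policy are made up), not my actual cache file, which lives under the plugin's configured policy cache directory:

```python
import json

# Illustrative stand-in for the downloaded policy cache JSON; in practice,
# read the .json file the plugin writes to its local policy cache directory.
sample_cache = """
{
  "serviceName": "hivedev",
  "policyVersion": 7,
  "policies": [
    {
      "id": 1,
      "name": "default-database",
      "isEnabled": true,
      "resources": {
        "database": {"values": ["default"], "isExcludes": false, "isRecursive": false}
      }
    }
  ]
}
"""

def policies_for_database(cache_text, database):
    """Return names of enabled policies whose database resource lists the
    given database (or a '*' wildcard). Note: this is a literal membership
    check, not Ranger's full wildcard/recursive matching."""
    cache = json.loads(cache_text)
    matches = []
    for policy in cache.get("policies", []):
        if not policy.get("isEnabled", False):
            continue
        values = policy.get("resources", {}).get("database", {}).get("values", [])
        if database in values or "*" in values:
            matches.append(policy["name"])
    return matches

print(policies_for_database(sample_cache, "default"))
```

If the cache file contains an enabled policy covering the 'default' database but the engine still reports evaluatorCount=0, that would point to the policies not reaching the in-memory engine, e.g. a class-loader issue as I suspected.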
I have attached part of the logs. Please let me know if you can point me in
the right direction.
Thanks
Bosco
18/07/09 21:58:21 DEBUG RangerPolicyRepository: ==>
RangerPolicyRepository.setAuditEnabledFromCache()
18/07/09 21:58:21 DEBUG RangerPolicyRepository: <==
RangerPolicyRepository.setAuditEnabledFromCache():false
18/07/09 21:58:21 INFO RangerPolicyRepository: DELETE ME:
policyResourceTrie={database=resourceName=database; optIgnoreCase=true;
optWildcard=true; wildcardChars=*?{}\; nodeCount=1; leafNodeCount=1;
singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0;
wildcardEvaluatorListCount=0; evaluatorListRefCount=0;
wildcardEvaluatorListRefCount=0, udf=resourceName=udf; optIgnoreCase=true;
optWildcard=true; wildcardChars=*?{}\; nodeCount=1; leafNodeCount=1;
singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0;
wildcardEvaluatorListCount=0; evaluatorListRefCount=0;
wildcardEvaluatorListRefCount=0, hiveservice=resourceName=hiveservice;
optIgnoreCase=true; optWildcard=true; wildcardChars=*?{}\; nodeCount=1;
leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0;
wildcardEvaluatorListCount=0; evaluatorListRefCount=0;
wildcardEvaluatorListRefCount=0, column=resourceName=column;
optIgnoreCase=true; optWildcard=true; wildcardChars=*?{}\; nodeCount=1;
leafNodeCount=1; singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0;
wildcardEvaluatorListCount=0; evaluatorListRefCount=0;
wildcardEvaluatorListRefCount=0, url=resourceName=url; optIgnoreCase=true;
optWildcard=true; wildcardChars=*?{}\; nodeCount=1; leafNodeCount=1;
singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0;
wildcardEvaluatorListCount=0; evaluatorListRefCount=0;
wildcardEvaluatorListRefCount=0, table=resourceName=table; optIgnoreCase=true;
optWildcard=true; wildcardChars=*?{}\; nodeCount=1; leafNodeCount=1;
singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0;
wildcardEvaluatorListCount=0; evaluatorListRefCount=0;
wildcardEvaluatorListRefCount=0}
18/07/09 21:58:21 INFO RangerPolicyRepository: DELETE ME:
resourceKeys=[database]
18/07/09 21:58:21 INFO RangerPolicyRepository: DELETE ME:
resourceName=database, trie=resourceName=database; optIgnoreCase=true;
optWildcard=true; wildcardChars=*?{}\; nodeCount=1; leafNodeCount=1;
singleChildNodeCount=0; maxDepth=1; evaluatorListCount=0;
wildcardEvaluatorListCount=0; evaluatorListRefCount=0;
wildcardEvaluatorListRefCount=0
18/07/09 21:58:21 DEBUG RangerResourceTrie: ==>
RangerResourceTrie.getEvaluatorsForResource(default)
18/07/09 21:58:21 DEBUG RangerResourceTrie: <==
RangerResourceTrie.getEvaluatorsForResource(default): evaluatorCount=0
18/07/09 21:58:21 DEBUG RangerPolicyRepository: <==
RangerPolicyRepository.getLikelyMatchPolicyEvaluators(default): evaluatorCount=0
18/07/09 21:58:21 INFO RangerPolicyEngineImpl: DELETE ME: evaluators.size=0,
request.getResource()=RangerResourceImpl={ownerUser={null}
elements={database=default; } }, policyType=0, resource.getAsString()=default
18/07/09 21:58:21 DEBUG RangerPolicyRepository: ==>
RangerPolicyRepository.storeAuditEnabledInCache()
18/07/09 21:58:21 DEBUG RangerPolicyRepository: <==
RangerPolicyRepository.storeAuditEnabledInCache()
18/07/09 21:58:21 DEBUG RangerPolicyEngineImpl: <==
RangerPolicyEngineImpl.evaluatePoliciesNoAudit(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null}
elements={database=default; } }} accessType={_any} user={hive}
userGroups={hadoop } accessTime={Mon Jul 09 21:58:21 UTC 2018}
clientIPAddress={null} forwardedAddresses={} remoteIPAddress={null}
clientType={HIVECLI} action={METADATA OPERATION} requestData={null}
sessionId={d69a2cc0-7654-4dcd-8aaa-6f81b9c9026f} resourceMatchingScope={SELF}
clusterName={brown} context={token:USER={hive} } }, policyType =0):
RangerAccessResult={isAccessDetermined={false} isAllowed={false}
isAuditedDetermined={false} isAudited={false} policyType={0} policyId={-1}
auditPolicyId={-1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
18/07/09 21:58:21 DEBUG usage: [PERF]
RangerPolicyEngine.usage(accessingUser=hive,accessedResource=default,accessType=_any,evaluatedPoliciesCount=0):
0
18/07/09 21:58:21 DEBUG request: [PERF]
RangerPolicyEngine.evaluatePolicies(requestHashCode=a3f1f32_0): 3
18/07/09 21:58:21 DEBUG RangerPolicyEngineImpl: <==
RangerPolicyEngineImpl.evaluatePolicies(RangerAccessRequestImpl={resource={RangerResourceImpl={ownerUser={null}
elements={database=default; } }} accessType={_any} user={hive}
userGroups={hadoop } accessTime={Mon Jul 09 21:58:21 UTC 2018}
clientIPAddress={null} forwardedAddresses={} remoteIPAddress={null}
clientType={HIVECLI} action={METADATA OPERATION} requestData={null}
sessionId={d69a2cc0-7654-4dcd-8aaa-6f81b9c9026f} resourceMatchingScope={SELF}
clusterName={brown} context={token:USER={hive} } }, policyType=0):
RangerAccessResult={isAccessDetermined={false} isAllowed={false}
isAuditedDetermined={false} isAudited={false} policyType={0} policyId={-1}
auditPolicyId={-1} evaluatedPoliciesCount={0} reason={null} additionalInfo={}}
18/07/09 21:58:21 DEBUG request: [PERF]
RangerHiveAuthorizer.filterListCmdObjects(): 19
18/07/09 21:58:21 DEBUG RangerHiveAuthorizer: <== filterListCmdObjects:
count[0], ret[[]]