roryqi commented on issue #10198:
URL: https://github.com/apache/gravitino/issues/10198#issuecomment-4035988025

   I will elaborate more if you are interested. Sorry for the poor description.
   
   Spark SQL execution process: parse -> analyze -> logical plan optimization -> 
physical plan optimization
   
   The analysis stage resolves the unresolved relations: it loads each table from 
the Gravitino catalog, and that is where the forbidden exception is currently 
thrown.
   
   Yes, injecting check rules is one way to do it. The problem is that the 
forbidden exception is thrown during analysis, before any check rules run. So we 
can modify our code: instead of throwing the forbidden exception at resolution 
time, return a specific marker table, and then check whether the resolved plan 
contains any forbidden tables inside the check rules.
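   To make the idea concrete, here is a minimal, self-contained sketch of the deferred-check pattern described above. It is not the actual Gravitino or Spark code: `Relation`, `resolve`, and `checkPlan` are hypothetical stand-ins for relation resolution and an injected check rule. The point is only the control flow: resolution never throws, it tags forbidden tables with a marker, and a later check pass over the plan raises the exception.

```java
import java.util.ArrayList;
import java.util.List;

public class DeferredForbiddenCheck {
    // Minimal stand-in for a resolved relation in a logical plan.
    record Relation(String name, boolean forbidden) {}

    // Resolution phase: never throws; tags forbidden tables instead of
    // failing analysis outright.
    static Relation resolve(String name, List<String> forbiddenTables) {
        return new Relation(name, forbiddenTables.contains(name));
    }

    // Check phase (analogous to an injected check rule): runs after the
    // plan is fully resolved and throws if any marker table is present.
    static void checkPlan(List<Relation> plan) {
        for (Relation r : plan) {
            if (r.forbidden()) {
                throw new SecurityException("Forbidden table: " + r.name());
            }
        }
    }

    public static void main(String[] args) {
        List<String> forbidden = List.of("secret_table");
        List<Relation> plan = new ArrayList<>();
        plan.add(resolve("public_table", forbidden));
        plan.add(resolve("secret_table", forbidden));
        try {
            checkPlan(plan);
        } catch (SecurityException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

   In real Spark terms this would correspond to returning a placeholder table from the catalog instead of throwing, and raising the error from a rule registered via `SparkSessionExtensions.injectCheckRule`, so the failure happens at the check stage rather than mid-analysis.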


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
