[ 
https://issues.apache.org/jira/browse/SPARK-19526?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Grover updated SPARK-19526:
--------------------------------
    Summary: Spark should raise an exception when it tries to read a Hive view 
but it doesn't have read access on the corresponding table(s)  (was: Spark 
should rise an exception when it tries to read a Hive view but it doesn't have 
read access on the corresponding table(s))

> Spark should raise an exception when it tries to read a Hive view but it 
> doesn't have read access on the corresponding table(s)
> -------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-19526
>                 URL: https://issues.apache.org/jira/browse/SPARK-19526
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.6.4, 2.0.3, 2.2.0, 2.3.0
>            Reporter: Reza Safi
>
> Spark sees a Hive view as a set of HDFS "files". So, to read anything from a 
> Hive view, Spark needs access to all of the files that belong to the 
> table(s) the view queries. In other words, a Spark user cannot be 
> granted fine-grained permissions at the level of Hive columns or records.
> Consider a Spark job that contains a SQL query reading a Hive view. 
> Currently, the Spark job finishes successfully even if the user running it 
> doesn't have read access to the tables the Hive view is built on top of; it 
> just returns an empty result set. This can be confusing for users, since 
> the job finishes without any exception or error. 
> Spark should raise an exception such as AccessDenied when it runs a 
> Hive view query and its user doesn't have the required permissions on the 
> tables the view is built on. 



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
