Thanks, this looks promising. I am trying to do it without a dependency on
Hive and was hoping that the extension hooks could be used to add a filter
transformation to the logical plan. I've seen another email saying that
in the optimisation hook the logical plan is expected to stay the same (
https://stackoverflow.com/questions/40235566/transforming-spark-sql-ast-with-extraoptimizations/40273936
)

But I'm still hoping that some other extension hook can be used to add the
filter operation. Does anyone know whether that is possible?

There is not much documentation on the extension hooks, so I could not
figure it out from what is already there.
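
To make it concrete, this is roughly what I have in mind: an analyzer rule
injected through injectResolutionRule that wraps every relation in a
Filter. It is an untested sketch, and the "tenant_id" column, the literal
and the class names are placeholders of mine:

import org.apache.spark.sql.{SparkSession, SparkSessionExtensions}
import org.apache.spark.sql.catalyst.expressions.{AttributeReference, EqualTo, Expression, Literal}
import org.apache.spark.sql.catalyst.plans.logical.{Filter, LogicalPlan}
import org.apache.spark.sql.catalyst.rules.Rule
import org.apache.spark.sql.execution.datasources.LogicalRelation

// Pins every scan of a relation that has a tenant_id column to one tenant.
case class RowFilterRule(session: SparkSession) extends Rule[LogicalPlan] {

  override def apply(plan: LogicalPlan): LogicalPlan = addFilter(plan)

  private def addFilter(plan: LogicalPlan): LogicalPlan = plan match {
    // Already wrapped on an earlier analyzer pass: leave it alone, so the
    // rule stays idempotent (resolution rules run until a fixed point).
    case f @ Filter(cond, _: LogicalRelation) if isRowFilter(cond) => f
    case r: LogicalRelation =>
      r.output.find(_.name == "tenant_id")
        .map(attr => Filter(EqualTo(attr, Literal("tenant-a")), r))
        .getOrElse(r)
    case other => other.withNewChildren(other.children.map(addFilter))
  }

  // Crude marker for "our" filter; a real version would tag the plan node.
  private def isRowFilter(cond: Expression): Boolean = cond match {
    case EqualTo(a: AttributeReference, _: Literal) => a.name == "tenant_id"
    case _ => false
  }
}

class RowSecurityExtensions extends (SparkSessionExtensions => Unit) {
  def apply(ext: SparkSessionExtensions): Unit =
    ext.injectResolutionRule(RowFilterRule)
}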

Regards,
Richard


On Fri, 17 Aug 2018 at 15:33, Maximiliano Patricio Méndez <
mmen...@despegar.com> wrote:

> Hi,
>
> I've added table-level security using Spark extensions, based on the
> ongoing work proposed for Ranger in RANGER-2128. Following the same
> logic, you could mask columns by rewriting the logical plan (rough
> sketch below), but not filter or skip rows, as those are not present in
> these hooks.
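>
> The rough shape of such a masking rule, as an untested sketch (the "ssn"
> column name and the mask value are placeholders of mine):
>
> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.catalyst.expressions.{Alias, AttributeReference, Literal}
> import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, Project}
> import org.apache.spark.sql.catalyst.rules.Rule
>
> // Replaces a sensitive column with a constant wherever it is projected.
> // A real version would look up the masked columns per table and user.
> case class MaskColumnRule(session: SparkSession) extends Rule[LogicalPlan] {
>   override def apply(plan: LogicalPlan): LogicalPlan = plan transformUp {
>     case p @ Project(projectList, _) =>
>       p.copy(projectList = projectList.map {
>         case a: AttributeReference if a.name == "ssn" =>
>           // Keep the original name and exprId so the rest of the plan
>           // still resolves; assumes the column is a string type.
>           Alias(Literal("***"), a.name)(exprId = a.exprId)
>         case other => other
>       })
>   }
> }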
>
> The only difficulty I found was integrating extensions with PySpark,
> since in Python the SparkContext is always created through the
> constructor and not via the Scala getOrCreate() method (I've sent an
> email regarding this). Other than that, it works.
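>
> For completeness, wiring the rule in is the small part; on the Scala
> side it looks roughly like this (the rule name is from the sketch above,
> the class name is a placeholder):
>
> // Programmatically, when building the session:
> val spark = SparkSession.builder()
>   .withExtensions(_.injectResolutionRule(MaskColumnRule))
>   .getOrCreate()
>
> // Or declaratively, so sessions created elsewhere pick it up too:
> //   spark-submit --conf spark.sql.extensions=com.example.MyExtensions ...
> // where com.example.MyExtensions is a zero-argument class extending
> // (SparkSessionExtensions => Unit).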
>
>
> On Fri, Aug 17, 2018, 03:56 Richard Siebeling <rsiebel...@gmail.com>
> wrote:
>
>> Hi,
>>
>> I'd like to implement some kind of row-level security and am thinking of
>> adding additional filters to the logical plan, possibly using the Spark
>> session extensions. Would this be feasible, for example using
>> injectResolutionRule?
>>
>> thanks in advance,
>> Richard
>>
>
