AngersZhuuuu opened a new pull request, #5627:
URL: https://github.com/apache/kyuubi/pull/5627
### _Why are the changes needed?_
To close #5594
For case
```python
def filter_func(iterator):
    for pdf in iterator:
        yield pdf[pdf.id == 1]

df = spark.read.table("test_mapinpandas")
execute_result = df.mapInPandas(filter_func, df.schema).show()
```
The logical plan is:
```
GlobalLimit 21
+- LocalLimit 21
+- Project [cast(id#5 as string) AS id#11, name#6]
+- MapInPandas filter_func(id#0, name#1), [id#5, name#6]
         +- HiveTableRelation [`default`.`test_mapinpandas`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, Data Cols: [id#0, name#1], Partition Cols: []]
```
When handling `MapInPandas`, we did not match its input against
`HiveTableRelation`, so the input table's columns were missed. This PR fixes that.
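A minimal, self-contained sketch of the failure mode (no Spark or Kyuubi dependency; the case classes and `collectInputCols` here are illustrative stand-ins for the plugin's real plan traversal, not its actual API):

```scala
// Toy model of a logical plan tree.
sealed trait Node
case class HiveTableRelation(table: String, cols: Seq[String]) extends Node
case class MapInPandas(child: Node) extends Node
case class Project(child: Node) extends Node

// Walk the plan and collect the columns of every reachable relation.
def collectInputCols(plan: Node): Seq[String] = plan match {
  case HiveTableRelation(_, cols) => cols
  // Without a case for MapInPandas, traversal stops at that node and the
  // relation's columns never reach the privilege check; recursing into its
  // child (as this PR does for the real extractor) restores them.
  case MapInPandas(child) => collectInputCols(child)
  case Project(child)     => collectInputCols(child)
  case _                  => Nil // unmatched node: columns silently lost
}

val plan = Project(MapInPandas(
  HiveTableRelation("default.test_mapinpandas", Seq("id", "name"))))
println(collectInputCols(plan)) // List(id, name)
```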
### _How was this patch tested?_
- [x] Add some test cases that check the changes thoroughly including
negative and positive cases if possible
- [ ] Add screenshots for manual tests if appropriate
- [ ] [Run
test](https://kyuubi.readthedocs.io/en/master/contributing/code/testing.html#running-tests)
locally before making a pull request
With this patch, the correct privilege object is now extracted:
<img width="750" alt="Screenshot 2023-11-02 10:25:38 AM"
src="https://github.com/apache/kyuubi/assets/46485123/82e9a91d-130c-456e-9069-5b7d5046d940">
### _Was this patch authored or co-authored using generative AI tooling?_
<!--
If a generative AI tooling has been used in the process of authoring this
patch, please include
phrase 'Generated-by: ' followed by the name of the tool and its version.
If no, write 'No'.
Please refer to the [ASF Generative Tooling
Guidance](https://www.apache.org/legal/generative-tooling.html) for details.
-->
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]