AngersZhuuuu opened a new pull request #31506:
URL: https://github.com/apache/spark/pull/31506
### What changes were proposed in this pull request?
Add `SHOW COLUMNS` as a table-valued function `show_columns`.
Previously, column metadata could not be analyzed directly in SQL: it had to be exported first and then analyzed externally.
With this change, such analysis can be done in plain SQL, e.g. counting a table's columns or checking column names against a regex.
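For example, assuming the function is registered as `show_columns(<table name>)` as proposed here, these checks could be written directly in SQL (a sketch, not code from this patch):
```
sql("CREATE TABLE t(id INT) PARTITIONED BY (part STRING)")

// Count the columns of table `t`.
sql("SELECT count(*) AS num_cols FROM show_columns('t')").show()

// Check column names against a regex, e.g. find columns ending in `id`.
sql("SELECT col_name FROM show_columns('t') WHERE col_name RLIKE '.*id$'").show()
```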
### Why are the changes needed?
Makes it convenient for users to analyze table metadata directly in SQL.
### Does this PR introduce _any_ user-facing change?
Yes. Users can use the `show_columns` table-valued function, for example:
```
sql("CREATE TABLE t(id INT) PARTITIONED BY (part STRING)")
sql("SELECT * from show_columns('t')")
+--------+
|col_name|
+--------+
|      id|
|    part|
+--------+
```
### How was this patch tested?
Added UT.
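A minimal test could look roughly like the sketch below, assuming a suite extending `QueryTest` with `SharedSparkSession` and `SQLTestUtils`; the test name is hypothetical and this is not the actual test added by the patch:
```
test("show_columns table-valued function") {
  withTable("t") {
    sql("CREATE TABLE t(id INT) PARTITIONED BY (part STRING)")
    // The TVF should return one row per column, including partition columns.
    checkAnswer(
      sql("SELECT col_name FROM show_columns('t')"),
      Seq(Row("id"), Row("part")))
  }
}
```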