ulysses-you commented on pull request #29034:
URL: https://github.com/apache/spark/pull/29034#issuecomment-656113225
It's OK to add an `isCached` attribute, but it is not enough, for two reasons:
1. A database can contain hundreds of tables, and `spark.sql("show tables").where("isCached = true").show` cannot be used in pure SQL mode, so it is still hard to find all the cached tables among them (see the sketch after this list).
2. `SHOW TABLES` in `SparkSQLDriver` only shows the table name without any other attributes, for compatibility with Hive, so the new attribute would be useless in that case.
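
A minimal sketch of the point in reason 1, assuming the proposed `isCached` column were added to the `SHOW TABLES` output: the filter only works through the DataFrame API, so a user in the spark-sql CLI has no equivalent one-liner. The object name `FindCachedTables` and the catalog-based fallback loop are just illustrations, not anything in this PR.

```scala
import org.apache.spark.sql.SparkSession

object FindCachedTables {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("find-cached-tables")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical: relies on the `isCached` column proposed in this PR.
    // Only reachable from the DataFrame API, not from a plain SQL statement.
    spark.sql("SHOW TABLES")
      .where("isCached = true")
      .show()

    // What is available today without the new column: ask the catalog
    // table by table, which does not scale well to hundreds of tables.
    spark.catalog.listTables().collect().foreach { t =>
      val fullName = Seq(Option(t.database), Some(t.name)).flatten.mkString(".")
      if (spark.catalog.isCached(fullName)) println(fullName)
    }

    spark.stop()
  }
}
```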