GitHub user gatorsmile opened a pull request:

    https://github.com/apache/spark/pull/17319

    [SPARK-19765][SPARK-18549][SPARK-19093][SPARK-19736][BACKPORT-2.1][SQL] 
Backport Three Cache-related PRs to Spark 2.1

    ### What changes were proposed in this pull request?
    
    Backport a few cache related PRs: 
    
    ---
    [[SPARK-19093][SQL] Cached tables are not used in 
SubqueryExpression](https://github.com/apache/spark/pull/16493) 
    
    Consider the plans inside subquery expressions when looking up the cache
    manager to make use of cached data. Currently, `CacheManager.useCachedData`
    does not consider subquery expressions in the plan.
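    A hypothetical sketch of the issue (view names and data are made up; requires a running `SparkSession`): before this change, a cached table referenced only from inside a subquery expression would not hit the cache.

    ```scala
    // t2 is cached, but before this fix its cached plan was not substituted
    // when t2 appears only inside a subquery expression.
    spark.range(100).createOrReplaceTempView("t1")
    spark.range(10).createOrReplaceTempView("t2")
    spark.table("t2").cache()

    // The IN subquery refers to t2; CacheManager.useCachedData should use
    // the cached plan for t2 inside the SubqueryExpression as well.
    spark.sql("SELECT * FROM t1 WHERE id IN (SELECT id FROM t2)").show()
    ```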
    
    
    ---
    [[SPARK-19736][SQL] refreshByPath should clear all cached plans with the 
specified path](https://github.com/apache/spark/pull/17064)
    
    `Catalog.refreshByPath` can refresh the cache entry and the associated
    metadata for all DataFrames (if any) that contain the given data source path.
    
    However, `CacheManager.invalidateCachedPath` doesn't clear all cached plans
    with the specified path, which causes the strange behaviors reported in
    SPARK-15678.
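    A hypothetical sketch of the intended behavior (the path is made up; requires a running `SparkSession`): every cached plan reading from the path should be cleared and refreshed, not just one of them.

    ```scala
    val path = "/tmp/parquet_data"  // illustrative path
    spark.range(10).write.mode("overwrite").parquet(path)

    // Two distinct cached plans over the same source path.
    val df1 = spark.read.parquet(path).cache()
    val df2 = spark.read.parquet(path).filter("id > 5").cache()
    df1.count(); df2.count()  // materialize both cache entries

    // After this fix, both df1's and df2's cache entries are invalidated
    // and re-cached, because both plans contain the given path.
    spark.catalog.refreshByPath(path)
    ```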
    
    ---
    [[SPARK-19765][SPARK-18549][SQL] UNCACHE TABLE should un-cache all cached 
plans that refer to this table](https://github.com/apache/spark/pull/17097)
    
    When un-caching a table, we should not only remove the cache entry for the 
table itself, but also un-cache any other cached plans that refer to it. The 
following commands trigger the table uncache: `DropTableCommand`, 
`TruncateTableCommand`, `AlterTableRenameCommand`, `UncacheTableCommand`, 
`RefreshTable` and `InsertIntoHiveTable`.
    
    This PR also includes some refactoring:
    - Use `java.util.LinkedList` to store the cache entries, so that it is safer 
to remove elements while iterating.
    - Rename `invalidateCache` to `recacheByPlan`, which makes it more obvious 
what it does.
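    A hypothetical sketch of the behavior (table names are made up; requires a running `SparkSession` with Hive or in-memory catalog support): a cached plan that merely refers to the table should also be dropped when the table is un-cached.

    ```scala
    spark.range(10).write.saveAsTable("base_tbl")
    spark.sql("CACHE TABLE base_tbl")

    // This cached query's plan refers to base_tbl.
    spark.sql("CACHE TABLE derived AS SELECT id * 2 AS v FROM base_tbl")

    // After this change, un-caching base_tbl also un-caches the cached
    // plan for `derived`, since that plan refers to base_tbl.
    spark.sql("UNCACHE TABLE base_tbl")
    ```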
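    The `java.util.LinkedList` point can be illustrated with a minimal, self-contained Java sketch (the class and method names here are made up for illustration): removing elements mid-iteration is safe when done through `Iterator.remove()`, and on a linked list that removal is O(1).

    ```java
    import java.util.Iterator;
    import java.util.LinkedList;

    public class SafeRemoval {
        // Drop every entry whose name starts with the given prefix,
        // removing safely while iterating via Iterator.remove().
        public static LinkedList<String> dropMatching(LinkedList<String> entries,
                                                      String prefix) {
            Iterator<String> it = entries.iterator();
            while (it.hasNext()) {
                if (it.next().startsWith(prefix)) {
                    it.remove(); // safe in-place removal during iteration
                }
            }
            return entries;
        }

        public static void main(String[] args) {
            LinkedList<String> cache = new LinkedList<>();
            cache.add("tbl_a");
            cache.add("view_a");
            cache.add("tbl_b");
            System.out.println(dropMatching(cache, "tbl_"));
        }
    }
    ```
    
    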
    
    
    
    
    ### How was this patch tested?
    N/A

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/gatorsmile/spark backport-17097

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/17319.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #17319
    
----
commit 3f1895f315ef357ec9f9201748d760293deb4f88
Author: Xiao Li <[email protected]>
Date:   2017-03-16T07:36:35Z

    fix.

commit 11a8f31d5954c14eb8e546d001688f93357da676
Author: Xiao Li <[email protected]>
Date:   2017-03-16T17:35:21Z

    Merge remote-tracking branch 'upstream/branch-2.1' into backport-17097

----


