GitHub user zheh12 opened a pull request:

    https://github.com/apache/spark/pull/21257

    [SPARK-24194] HadoopFsRelation cannot overwrite a path that is also b…

    ## What changes were proposed in this pull request?
    
    When running INSERT OVERWRITE on a parquet (HadoopFsRelation) table, there is an error check:

    ```scala
    if (overwrite) DDLUtils.verifyNotReadPath(actualQuery, outputPath)
    ```

    but the same overwrite is allowed for a Hive table.
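
    For illustration, a minimal reproduction of the failing case might look like the following (the table name `t` and the query are hypothetical, not taken from the patch):

    ```scala
    // A parquet datasource table backed by a HadoopFsRelation.
    spark.range(10).write.format("parquet").saveAsTable("t")

    // Overwriting `t` with a query that also reads from `t` currently fails
    // the verifyNotReadPath check, while the equivalent statement on a Hive
    // table succeeds.
    spark.sql("INSERT OVERWRITE TABLE t SELECT id + 1 AS id FROM t")
    ```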
    
    The reason we cannot overwrite a **HadoopFsRelation** whose output path is also one of its input paths is that the output path is deleted before the job runs. I think we can fix this by deferring the delete: mark the path so that the old data is removed only after the job commits.
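
    As a rough sketch of the idea (not the actual code in this patch; the helper name and wiring are made up), the existing files would be remembered instead of deleted up front, and removed only from the commit path:

    ```scala
    import org.apache.hadoop.fs.{FileSystem, Path}
    import scala.collection.mutable

    // Hypothetical illustration of "delete after commit": keep the old files
    // readable while the job runs, drop them once the new output is committed.
    class DeferredOverwrite(fs: FileSystem) {
      private val staleFiles = mutable.Buffer.empty[Path]

      // Called where the eager delete used to happen: only record the old files.
      def markExistingFiles(outputPath: Path): Unit = {
        if (fs.exists(outputPath)) {
          fs.listStatus(outputPath).foreach(status => staleFiles += status.getPath)
        }
      }

      // Called after the job commits: now it is safe to remove the old data.
      def deleteAfterCommit(): Unit = {
        staleFiles.foreach(p => fs.delete(p, true))
      }
    }
    ```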
    
    ## How was this patch tested?
    
    I changed the tests in **InsertSuite** and **MetastoreDataSourceSuite** so that they now cover the case where the input and output tables are the same.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/zheh12/spark SPARK-24194

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/21257.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #21257
    
