Github user LantaoJin commented on the issue:

    https://github.com/apache/spark/pull/22411
  
    @cloud-fan I refactored the code and removed the ```outputPath``` function from ```DataWritingCommand```. Besides the unit test you can see here, I also added the test below to ```HiveQuerySuite.scala``` locally:
    ```scala
    test("SPARK-25421 DataWritingCommandExec(hive) should contain 'OutputPath' metadata") {
      withTable("t") {
        sql("CREATE TABLE t(col_I int)")
        val f = sql("INSERT OVERWRITE TABLE t SELECT 1")
        assert(SparkPlanInfo.fromSparkPlan(f.queryExecution.sparkPlan).metadata
          .contains("OutputPath"))
      }
    }
    ```
    But since ```HiveQuerySuite``` cannot access ```SparkPlanInfo```, I removed it again after the test passed locally.
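    If the check needs to live somewhere that can see ```SparkPlanInfo```, a minimal sketch could be a suite placed in the ```org.apache.spark.sql.execution``` package on the sql/core side. This is only an illustration: the suite name ```DataWritingCommandMetadataSuite``` is hypothetical, it uses a plain parquet data source table instead of a Hive table, and it assumes this PR populates the ```OutputPath``` metadata for ```DataWritingCommandExec``` generically:
    ```scala
    package org.apache.spark.sql.execution

    import org.apache.spark.sql.QueryTest
    import org.apache.spark.sql.test.SharedSQLContext

    // Hypothetical suite: living inside org.apache.spark.sql.execution lets it
    // reference SparkPlanInfo directly, which HiveQuerySuite cannot do.
    class DataWritingCommandMetadataSuite extends QueryTest with SharedSQLContext {

      test("DataWritingCommandExec should contain 'OutputPath' metadata") {
        withTable("t") {
          // A data source table avoids pulling in the Hive test dependencies.
          sql("CREATE TABLE t(col_I INT) USING parquet")
          val f = sql("INSERT OVERWRITE TABLE t SELECT 1")
          // Assumes the PR adds OutputPath to the metadata of DataWritingCommandExec.
          assert(SparkPlanInfo.fromSparkPlan(f.queryExecution.sparkPlan).metadata
            .contains("OutputPath"))
        }
      }
    }
    ```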

