This is an automated email from the ASF dual-hosted git repository.

bowenliang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/kyuubi.git


The following commit(s) were added to refs/heads/master by this push:
     new 9e3ac23df [KYUUBI #5192] Make Spark sql lineage plugin compilable on Scala 2.13
9e3ac23df is described below

commit 9e3ac23df71302c5b87dab206cdf96f42811035e
Author: liangbowen <[email protected]>
AuthorDate: Wed Aug 23 17:15:31 2023 +0800

    [KYUUBI #5192] Make Spark sql lineage plugin compilable on Scala 2.13
    
    ### _Why are the changes needed?_
    
    - to make the Spark SQL lineage plugin compilable on Scala 2.13 with Spark 3.2/3.3/3.4 (Spark 3.1 does not support Scala 2.13)
    `mvn clean install -DskipTests -Pflink-provided,hive-provided,spark-provided -Pscala-2.13 -pl :kyuubi-spark-lineage_2.13 -Pspark-3.3`
    - fix the type mismatch by explicitly converting the Iterable to a ListMap
    ```
    [ERROR] [Error] /Users/bw/dev/incubator-kyuubi/extensions/spark/kyuubi-spark-lineage/src/main/scala/org/apache/kyuubi/plugin/lineage/helper/SparkSQLLineageParseHelper.scala:220: type mismatch;
     found   : scala.collection.immutable.Iterable[(org.apache.spark.sql.catalyst.expressions.Attribute, org.apache.spark.sql.catalyst.expressions.AttributeSet)]
     required: LineageParser.this.AttributeMap[org.apache.spark.sql.catalyst.expressions.AttributeSet]
        (which expands to)  scala.collection.immutable.ListMap[org.apache.spark.sql.catalyst.expressions.Attribute,org.apache.spark.sql.catalyst.expressions.AttributeSet]
    ```
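    The root cause: in Scala 2.13, `zipWithIndex` on a `Map`-like collection no longer rebuilds the map type and instead returns a plain `immutable.Iterable` of pairs, so a subsequent `map` stays an `Iterable` rather than a `ListMap`. A minimal sketch of the same fix pattern, using hypothetical `String`/`Int` stand-ins for `Attribute`/`AttributeSet` (the `renumber` helper is illustrative, not from the patch):

    ```scala
    import scala.collection.immutable.ListMap

    object LineageMapFix {
      def renumber(columns: ListMap[String, Int]): ListMap[String, Int] = {
        // zipWithIndex widens the static type to Iterable[((String, Int), Int)]
        // on Scala 2.13, so the result of map is Iterable[(String, Int)].
        val renamed = columns.zipWithIndex.map {
          case ((k, v), i) => s"$k#$i" -> v
        }.toSeq
        // Rebuild the map explicitly, mirroring the patch's
        // ListMap[Attribute, AttributeSet](lineages: _*)
        ListMap[String, Int](renamed: _*)
      }
    }
    ```

    This compiles identically on 2.12 and 2.13, which is why the patch converts via a `Seq` and the `ListMap` varargs constructor instead of relying on the 2.12 `CanBuildFrom` behavior.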
    ### _How was this patch tested?_
    - [ ] Add some test cases that check the changes thoroughly including negative and positive cases if possible
    
    - [ ] Add screenshots for manual tests if appropriate
    
    - [x] [Run test](https://kyuubi.readthedocs.io/en/master/contributing/code/testing.html#running-tests) locally before making a pull request
    
    ### _Was this patch authored or co-authored using generative AI tooling?_
    
    No.
    
    Closes #5192 from bowenliang123/scala213-lineage.
    
    Closes #5192
    
    a68ba8457 [liangbowen] adapt spark lineage plugin to Scala 2.13
    
    Authored-by: liangbowen <[email protected]>
    Signed-off-by: liangbowen <[email protected]>
---
 .../kyuubi/plugin/lineage/helper/SparkSQLLineageParseHelper.scala    | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/extensions/spark/kyuubi-spark-lineage/src/main/scala/org/apache/kyuubi/plugin/lineage/helper/SparkSQLLineageParseHelper.scala b/extensions/spark/kyuubi-spark-lineage/src/main/scala/org/apache/kyuubi/plugin/lineage/helper/SparkSQLLineageParseHelper.scala
index ab669aa19..6beecba36 100644
--- a/extensions/spark/kyuubi-spark-lineage/src/main/scala/org/apache/kyuubi/plugin/lineage/helper/SparkSQLLineageParseHelper.scala
+++ b/extensions/spark/kyuubi-spark-lineage/src/main/scala/org/apache/kyuubi/plugin/lineage/helper/SparkSQLLineageParseHelper.scala
@@ -217,10 +217,11 @@ trait LineageParser {
             getField[LogicalPlan](plan, "plan")
           }
 
-        extractColumnsLineage(query, parentColumnsLineage).zipWithIndex.map {
+        val lineages = extractColumnsLineage(query, parentColumnsLineage).zipWithIndex.map {
           case ((k, v), i) if outputCols.nonEmpty => k.withName(s"$view.${outputCols(i)}") -> v
           case ((k, v), _) => k.withName(s"$view.${k.name}") -> v
-        }
+        }.toSeq
+        ListMap[Attribute, AttributeSet](lineages: _*)
 
       case p if p.nodeName == "CreateDataSourceTableAsSelectCommand" =>
         val table = getV1TableName(getField[CatalogTable](plan, "table").qualifiedName)
