dtenedor commented on code in PR #53691:
URL: https://github.com/apache/spark/pull/53691#discussion_r2709725828
##########
sql/core/src/test/resources/sql-tests/inputs/pipe-operators.sql:
##########
@@ -401,17 +401,18 @@ table t
|> extend 1 as `x.y.z`
|> drop `x.y.z`;
+-- Dropping a struct field using qualified name.
+table st
+|> drop col.i1;
Review Comment:
Other testing ideas (a rough SQL sketch follows below):
* Table alias qualified names as per the PR description: `... |> AS t |> DROP t.column`
* Multiple qualified columns in a single DROP: `table st |> DROP col.i1, col.i2`
* Multi-level nested structs: `... |> DROP outer.middle.inner`
* For `SELECT * FROM t |> AS col |> DROP col.x`, is `col.x` interpreted as dropping field `x` from struct column `col`, or dropping column `x` from table alias `col`? Let's add a test to cover it.
* (Negative test) Invalid qualified names: attempting to drop a non-existent qualified path, e.g. `table st |> DROP col.nonexistent`
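
A rough sketch of how these could look, assuming the existing fixtures (`t` with columns `x`/`y`, `st` with a struct column `col` holding fields `i1`/`i2`); `nst` below is a hypothetical nested-struct table, and the expected output or error for the negative case is whatever the analyzer actually reports:
```
-- Dropping a column through a table alias qualifier.
table t
|> as t2
|> drop t2.y;

-- Dropping multiple qualified struct fields in a single DROP.
table st
|> drop col.i1, col.i2;

-- Multi-level nested struct path (nst is a hypothetical fixture with a
-- column a of type struct<b: struct<c: int>>).
table nst
|> drop a.b.c;

-- Ambiguity: does col.x mean column x under the table alias col,
-- or field x of a struct column named col?
select * from t
|> as col
|> drop col.x;

-- Negative test: dropping a non-existent qualified path should fail.
table st
|> drop col.nonexistent;
```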
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/AstBuilder.scala:
##########
@@ -6659,10 +6659,12 @@ class AstBuilder extends DataTypeAstBuilder
}.getOrElse(Option(ctx.SET).map { _ =>
visitOperatorPipeSet(ctx, left)
}.getOrElse(Option(ctx.DROP).map { _ =>
- val ids: Seq[String] = visitIdentifierSeq(ctx.identifierSeq())
+ val ids: Seq[Seq[String]] =
+ ctx.multipartIdentifierList().multipartIdentifier.asScala.toSeq.map(
+ _.parts.asScala.map(_.getText).toSeq)
Review Comment:
Can you use the existing `visitMultipartIdentifier` helper for this? For example:
```
val ids: Seq[Seq[String]] =
ctx.multipartIdentifierList()
.multipartIdentifier()
.asScala
.toSeq
.map(visitMultipartIdentifier)
```