Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/17185#discussion_r207759811
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/package.scala ---
@@ -169,25 +181,50 @@ package object expressions {
})
}
- // Find matches for the given name assuming that the 1st part is a qualifier (i.e. table name,
- // alias, or subquery alias) and the 2nd part is the actual name. This returns a tuple of
+ // Find matches for the given name assuming that the 1st two parts are qualifier
+ // (i.e. database name and table name) and the 3rd part is the actual column name.
+ //
+ // For example, consider an example where "db1" is the database name, "a" is the table name
+ // and "b" is the column name and "c" is the struct field name.
+ // If the name parts is db1.a.b.c, then Attribute will match
--- End diff --
What I'm talking about is ambiguity: `col.innerField1.innerField2` can fail
if `innerField2` doesn't exist. My question is: should we try all the
possible resolution paths and pick the valid one, or define a rule that lets
us decide the resolution path ahead of time?
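
For illustration, a minimal spark-shell sketch of the kind of ambiguity in question; the view name `a`, struct column `b`, and field `c` are made up for this example and do not come from the PR itself:

```scala
// Minimal sketch (Scala, spark-shell) of the ambiguity discussed above.
// All names here (view "a", struct column "b", field "c") are illustrative.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[1]").getOrCreate()

// A view "a" whose column "b" is a struct with a single field "c".
spark.sql("CREATE OR REPLACE TEMPORARY VIEW a AS SELECT named_struct('c', 1) AS b")

// "a.b.c" can be read as: table "a" -> column "b" -> struct field "c".
// With more name parts (e.g. db1.a.b.c) the analyzer must also consider
// database/table qualifiers, so the same string admits several candidate
// resolution paths.
spark.sql("SELECT a.b.c FROM a").show()

// If the inner field does not exist, that particular path fails at analysis
// time (AnalysisException), which is the situation described above: should
// the analyzer try every candidate path and keep the one that resolves, or
// fix the path by rule before trying?
// spark.sql("SELECT a.b.nonexistent FROM a").show()  // would fail
```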
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]