zhenlineo commented on code in PR #39712:
URL: https://github.com/apache/spark/pull/39712#discussion_r1088088339
##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/Column.scala:
##########
@@ -80,7 +81,7 @@ class Column private[sql] (private[sql] val expr: proto.Expression) {
}
}
-object Column {
+private[sql] object Column {
Review Comment:
Thanks for your input.
Looking at the current
[Column](https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/Column.scala#L35)
class, the SQL API gives two public constructors for building a Column:
```scala
class Column(val expr: Expression) extends Logging {
  def this(name: String) = this(name match {
    case "*" => UnresolvedStar(None)
    case _ if name.endsWith(".*") =>
      val parts = UnresolvedAttribute.parseAttributeName(name.substring(0, name.length - 2))
      UnresolvedStar(Some(parts))
    case _ => UnresolvedAttribute.quotedString(name)
  })
  ...
```
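For context, the Connect client would mirror this by building a proto expression rather than a Catalyst one. Below is a minimal sketch only, assuming the Connect `proto.Expression` message exposes an `UnresolvedAttribute` with an `unparsed_identifier` field; the `apply` placement is hypothetical and star handling (`"*"`, `"a.*"`) is omitted:
```scala
// Hypothetical sketch of a name-based factory on the client's Column companion.
// Assumes: import org.apache.spark.connect.proto, and that proto.Expression
// exposes an UnresolvedAttribute message with an unparsed_identifier field.
private[sql] object Column {
  def apply(name: String): Column = {
    // Build an unresolved attribute reference carrying the raw name string;
    // the server resolves it during analysis. Star expansion not handled here.
    val attr = proto.Expression.UnresolvedAttribute
      .newBuilder()
      .setUnparsedIdentifier(name)
      .build()
    new Column(proto.Expression.newBuilder().setUnresolvedAttribute(attr).build())
  }
}
```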
Right now the client API is far from complete; we will add new
methods in upcoming PRs. I am sure there will eventually be a `Column(name: String)` for
users to use, but including every public constructor the client needs is out of
the scope of this PR.
The compatibility check added in this PR will grow its coverage as
more methods are added to the client. For now, the check ensures that whenever a
new method is added, it is binary compatible with the
existing SQL API. Once client API coverage is high enough (~80%), we can switch to a
more aggressive check that ensures we did not miss any methods by mistake.
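For illustration, incremental checks like this are commonly expressed as MiMa problem filters that exclude the not-yet-implemented surface. A minimal sketch, where the filter targets are hypothetical examples rather than the actual rules in this PR:
```scala
import com.typesafe.tools.mima.core._

// Illustrative only: exclude client surface that is not implemented yet,
// so the binary-compatibility check passes while coverage grows. As methods
// land in the client, the corresponding filters are removed.
val connectClientExcludes: Seq[ProblemFilter] = Seq(
  // An entire class the client does not ship yet (hypothetical example).
  ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.SparkSession"),
  // An individual method still missing from the client (hypothetical example).
  ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.sql.Column.apply")
)
```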