Github user ssimeonov commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21840#discussion_r204250360
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/Column.scala ---
    @@ -1234,6 +1234,8 @@ class Column(val expr: Expression) extends Logging {
        */
       def over(): Column = over(Window.spec)
     
    +  def copy(field: String, value: Column): Column = withExpr(StructCopy(expr, field, value.expr))
    --- End diff ---
    
    > Can .cast() be expressed via the .copy() method if the former one will support add/delete/update operations?
    
    Are you suggesting `Dataset` adds `copy()` as opposed to `cast()`?
    
    Otherwise, certainly, `Dataset.cast()` can be implemented with `Dataset.select()`, but it's not easy even in the simple cases, and it becomes noticeably more complicated when you consider correctly setting nullability & metadata in the edge cases.
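
    To make the point concrete, here is a rough `select()`-based sketch (the `castTo` helper name is hypothetical, not part of this PR). Even the "simple" version below has to re-attach each field's metadata explicitly via `Column.as(name, metadata)`, and it still cannot tighten nullability, since `cast` preserves the source column's nullable flag:

    ```scala
    import org.apache.spark.sql.DataFrame
    import org.apache.spark.sql.functions.col
    import org.apache.spark.sql.types.StructType

    // Hypothetical helper: cast a DataFrame to a target schema via select().
    // A bare col(f.name).cast(f.dataType) would silently drop the column's
    // metadata, so it must be carried over by hand; nullability is still
    // whatever the cast produces, not what the target schema declares.
    def castTo(df: DataFrame, schema: StructType): DataFrame =
      df.select(schema.fields.map { f =>
        col(f.name).cast(f.dataType).as(f.name, f.metadata)
      }: _*)
    ```

    And that is before handling nested structs, arrays, and maps, where each level needs its own rebuild.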


---
