HyukjinKwon commented on a change in pull request #26013: [SPARK-29347][SQL] Add JSON serialization for external Rows
URL: https://github.com/apache/spark/pull/26013#discussion_r333285090
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/Row.scala
 ##########
 @@ -501,4 +513,88 @@ trait Row extends Serializable {
   private def getAnyValAs[T <: AnyVal](i: Int): T =
    if (isNullAt(i)) throw new NullPointerException(s"Value at index $i is null")
     else getAs[T](i)
+
+  /** The compact JSON representation of this row. */
+  def json: String = compact(jsonValue)
 
 Review comment:
   As an example, here's one API in `Row` where we gave up on a planned performance improvement and ended up deprecating the method.
   
   ```scala
    @deprecated("This method is deprecated and will be removed in future versions.", "3.0.0")
    def merge(rows: Row*): Row = {
      // TODO: Improve the performance of this if used in performance critical part.
      new GenericRow(rows.flatMap(_.toSeq).toArray)
    }
   ```
   
   Would you mind adding `@Unstable` or `@Private` to these new APIs instead, along with a `@since` tag, to leave room for future improvement just in case?
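   
   A minimal sketch of what the annotated API could look like. This is an illustration only: the local `Unstable` annotation class is a stand-in for Spark's real `org.apache.spark.annotation.Unstable` (which lives in the `spark-tags` module), and `RowLike`/`jsonValue` here are simplified placeholders for the actual `Row` trait.
   
   ```scala
   import scala.annotation.StaticAnnotation
   
   // Stand-in for org.apache.spark.annotation.Unstable, so this sketch
   // compiles on its own. In Spark, use the real annotation instead.
   class Unstable extends StaticAnnotation
   
   // Simplified placeholder for the Row trait in this PR.
   trait RowLike {
     // Hypothetical helper; the real Row builds a JSON AST internally.
     def jsonValue: String
   
     /**
      * The compact JSON representation of this row.
      *
      * @since 3.0.0
      */
     @Unstable
     def json: String = jsonValue
   }
   ```
   
   Marking the method `@Unstable` (or `@Private`) signals that its behavior may change in a future release without a deprecation cycle, which keeps the door open for later performance work on the JSON path.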

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
