MrPowers commented on a change in pull request #31073:
URL: https://github.com/apache/spark/pull/31073#discussion_r554457442



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/functions.scala
##########
@@ -2841,6 +2841,31 @@ object functions {
   // DateTime functions
   
//////////////////////////////////////////////////////////////////////////////////////////////
 
+  /**
+   * Creates a datetime interval.
+   *
+   * @param years Number of years
+   * @param months Number of months
+   * @param weeks Number of weeks
+   * @param days Number of days
+   * @param hours Number of hours
+   * @param mins Number of minutes
+   * @param secs Number of seconds
+   * @return A datetime interval
+   * @group datetime_funcs
+   * @since 3.2.0
+   */
+  def make_interval(

Review comment:
       @zero323 - are you OK if I create a separate JIRA and open a separate pull 
request to add this function to PySpark & R?  It seems @HyukjinKwon is open to 
either doing everything in one PR or having separate PRs, while @jaceklaskowski 
prefers one PR per language.  Just let me know how you'd like me to proceed to 
keep this moving forward, thanks!
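For context on what the quoted signature computes: Spark stores such intervals internally as a CalendarInterval, a (months, days, microseconds) triple. Below is a minimal Python sketch (not Spark code; the helper name and defaults are assumptions mirroring the Scaladoc parameters) of how the seven `make_interval` arguments could fold into that representation:

```python
# Hypothetical sketch: fold make_interval's seven arguments into a
# CalendarInterval-style (months, days, microseconds) triple.
MICROS_PER_SECOND = 1_000_000

def make_interval(years=0, months=0, weeks=0, days=0,
                  hours=0, mins=0, secs=0.0):
    # Years collapse into months; weeks collapse into days; the
    # sub-day components collapse into microseconds.
    total_months = years * 12 + months
    total_days = weeks * 7 + days
    micros = ((hours * 3600 + mins * 60) * MICROS_PER_SECOND
              + int(round(secs * MICROS_PER_SECOND)))
    return (total_months, total_days, micros)

print(make_interval(years=1, months=2, weeks=1, days=3,
                    hours=4, mins=5, secs=6.5))
# → (14, 10, 14706500000)
```

Keeping months, days, and microseconds separate (rather than a single duration) matters because a month or a day is not a fixed number of seconds once time zones and daylight saving are involved.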




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]



---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
