Github user shivaram commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13660#discussion_r67018458
  
    --- Diff: docs/sparkr.md ---
    @@ -262,6 +262,67 @@ head(df)
     {% endhighlight %}
     </div>
     
    +### Applying User-defined Function
    +
    +#### dapply
    +Apply a function to each partition of a `SparkDataFrame`. The function should have only one parameter, to which a `data.frame` corresponding to each partition will be passed. The output of the function should be a `data.frame`. `schema` specifies the row format of the resulting `SparkDataFrame` and must match the data types of the returned value.
    +<div data-lang="r"  markdown="1">
    +{% highlight r %}
    +
    +# Convert waiting time from minutes to seconds.
    +# Note that we can apply a UDF to a DataFrame.
    +schema <- structType(structField("eruptions", "double"), structField("waiting", "double"),
    +                     structField("waiting_secs", "double"))
    +df1 <- dapply(df, function(x) { x <- cbind(x, x$waiting * 60) }, schema)
    +head(collect(df1), 3)
    +##  eruptions waiting waiting_secs
    +##1     3.600      79         4740
    +##2     1.800      54         3240
    +##3     3.333      74         4440
    +
    +{% endhighlight %}
    +</div>
    +
    +#### dapplyCollect
    +Like `dapply`, apply a function to each partition of a `SparkDataFrame` and collect the result back as a local R `data.frame`.
    +<div data-lang="r"  markdown="1">
    +{% highlight r %}
    +
    +# Convert waiting time from minutes to seconds.
    +# Note that we can apply a UDF to a DataFrame.
    +ldf <- dapplyCollect(
    +         df,
    +         function(x) {
    +           x <- cbind(x, "waiting_secs" = x$waiting * 60)
    +         })
    +head(ldf, 3)
    --- End diff ---
    
    Would be good to show how `head(ldf)` looks here. Also, we should note that the difference in `dapplyCollect` is that the schema doesn't need to be passed in by the user.
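
    For reference, a sketch of how that might render (hypothetical output, mirroring the `dapply` example above since it applies the same `waiting_secs` transformation; note that no schema argument is needed):

    ```r
    # head of the local R data.frame returned by dapplyCollect
    head(ldf, 3)
    ##  eruptions waiting waiting_secs
    ##1     3.600      79         4740
    ##2     1.800      54         3240
    ##3     3.333      74         4440
    ```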

