Github user junyangq closed the pull request at:
https://github.com/apache/spark/pull/15100
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/15100
Sure, thanks @shivaram
---
GitHub user junyangq opened a pull request:
https://github.com/apache/spark/pull/15100
[SPARK-17317][SparkR] Add SparkR vignette to branch 2.0
## What changes were proposed in this pull request?
This PR adds a SparkR vignette to branch 2.0, which works as a friendly guide
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14980
@shivaram Yeah sure :)
---
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r78679227
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -385,22 +385,29 @@ head(result[order(result$max_mpg, decreasing = TRUE), ])
Similar
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14980
Thank you @felixcheung for another pass.
---
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r78594958
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14980
@shivaram Yes, sounds good to me. Do I need to prepare two versions for
that?
---
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14980
@felixcheung we still need to deal with the leftover files in that case?
---
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r77952939
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r77952325
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r77804870
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r77780774
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r77780090
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r9369
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r9018
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r8144
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r7495
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r7073
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r7086
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r5442
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r4636
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r3077
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14980#discussion_r2782
--- Diff: R/pkg/vignettes/sparkr-vignettes.Rmd ---
@@ -0,0 +1,853 @@
+---
+title: "SparkR - Practical Guide"
+output:
+ htm
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14980
cc @shivaram @felixcheung @mengxr
---
GitHub user junyangq opened a pull request:
https://github.com/apache/spark/pull/14980
[SPARK-17317][SparkR] Add SparkR vignette
## What changes were proposed in this pull request?
This PR tries to add a SparkR vignette, which works as a friendly guide
going through
GitHub user junyangq opened a pull request:
https://github.com/apache/spark/pull/14942
[SparkR][Minor] Fix docs for sparkR.session and count
## What changes were proposed in this pull request?
This PR tries to add some more explanation to `sparkR.session`. It also
modifies
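For context on the APIs whose docs the PR above touches, a minimal SparkR session looks roughly like this. This is an illustrative sketch, not text from the PR: the master URL, app name, and config values are assumptions, and `count` is shown in both of its documented roles.

```r
library(SparkR)

# Start (or reuse) a Spark session; all parameter values here are illustrative.
sparkR.session(master = "local[*]", appName = "docs-example",
               sparkConfig = list(spark.driver.memory = "1g"))

df <- as.DataFrame(faithful)         # promote a local R data.frame
count(df)                            # number of rows in the SparkDataFrame
head(count(groupBy(df, "waiting")))  # grouped count, also named `count`

sparkR.session.stop()
```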
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14881#discussion_r77384344
--- Diff: R/pkg/R/mllib.R ---
@@ -1308,3 +1315,104 @@ setMethod("write.ml", signature(object = "ALSModel", path = "character"
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/13584
LGTM
---
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14881#discussion_r77212162
--- Diff: R/pkg/R/mllib.R ---
@@ -1308,3 +1315,104 @@ setMethod("write.ml", signature(object = "ALSModel", path = "character"
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14903
@shivaram I see. Thanks for letting us know.
---
GitHub user junyangq opened a pull request:
https://github.com/apache/spark/pull/14903
[SparkR][Minor] Fix windowPartitionBy example
## What changes were proposed in this pull request?
The usage in the original example is incorrect. This PR fixes it.
## How
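Corrected usage of `windowPartitionBy`, along the lines of what the example fix above addresses, would look roughly like the sketch below. This assumes an active SparkR session; the column names and the choice of `rank()` as the window function are illustrative, not taken from the PR.

```r
# Build a SparkDataFrame from a local dataset (illustrative).
df <- createDataFrame(mtcars)

# Partition by one column, then order within each partition by another.
ws <- orderBy(windowPartitionBy("am"), "hp")

# Apply a window function over the window spec.
out <- select(df, over(rank(), ws), df$hp, df$am)
head(out)
```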
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14856
Other than that, LGTM.
---
GitHub user junyangq opened a pull request:
https://github.com/apache/spark/pull/14881
[SPARK-17315][SparkR] Kolmogorov-Smirnov test SparkR wrapper
## What changes were proposed in this pull request?
This PR tries to add Kolmogorov-Smirnov Test wrapper to SparkR
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14856#discussion_r76844386
--- Diff: R/pkg/inst/tests/testthat/test_mllib.R ---
@@ -99,6 +99,10 @@ test_that("spark.glm summary", {
expect_match(out[2], "Dev
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14856#discussion_r76844264
--- Diff: R/pkg/inst/tests/testthat/test_mllib.R ---
@@ -99,6 +99,10 @@ test_that("spark.glm summary", {
expect_match(out[2], "Dev
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14856#discussion_r76841246
--- Diff: R/pkg/inst/tests/testthat/test_mllib.R ---
@@ -99,6 +99,10 @@ test_that("spark.glm summary", {
expect_match(out[2], "Dev
Github user junyangq closed the pull request at:
https://github.com/apache/spark/pull/14666
---
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14856#discussion_r76553172
--- Diff: R/pkg/R/mllib.R ---
@@ -171,7 +172,8 @@ predict_internal <- function(object, newData) {
#' @note spark.glm since 2.0.0
#' @seealso \l
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14856#discussion_r76552745
--- Diff: R/pkg/R/mllib.R ---
@@ -171,7 +172,8 @@ predict_internal <- function(object, newData) {
#' @note spark.glm since 2.0.0
#' @seealso \l
GitHub user junyangq opened a pull request:
https://github.com/apache/spark/pull/14853
[SparkR][Minor] Fix LDA doc
## What changes were proposed in this pull request?
This PR tries to fix the name of the `SparkDataFrame` used in the example.
Also, it gives a reference url
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14820
The unit test has one already. Do we need this as well?
---
GitHub user junyangq opened a pull request:
https://github.com/apache/spark/pull/14820
[SparkR][Minor] Fix example of spark.naiveBayes
## What changes were proposed in this pull request?
The original example doesn't work because the features are not categorical.
This PR
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14818
We may want to use a different name? A glmnet-related name could be confusing
if it is actually only multiclass logistic.
---
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/13584
Sounds good. That's also what we meant.
---
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/13584
@shivaram Does it sound reasonable to you? Just discussed this with
@jkbradley.
---
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/13584
@keypointt Can we keep searching (in random or sequential way) until an
unused column name has been found?
---
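The search suggested in the comment above, trying candidate names until one is unused, can be sketched in plain R. The function name `find_unused_name` and the sequential numeric-suffix scheme are hypothetical illustrations, not SparkR code.

```r
# Hypothetical helper: find a column name not already taken, by appending
# increasing numeric suffixes to a base name (sequential search).
find_unused_name <- function(base, existing) {
  if (!(base %in% existing)) {
    return(base)
  }
  i <- 1
  repeat {
    candidate <- paste0(base, "_", i)
    if (!(candidate %in% existing)) {
      return(candidate)
    }
    i <- i + 1
  }
}

find_unused_name("id", c("id", "id_1"))  # "id_2"
```

The loop always terminates because the pool of taken names is finite, which is the property the comment is relying on.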
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14792
LGTM
---
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14779#discussion_r76098894
--- Diff: R/pkg/R/functions.R ---
@@ -3200,19 +3212,27 @@ setMethod("lag",
#' This is equivalent to the \code{LEAD} funct
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14776#discussion_r76093242
--- Diff: R/pkg/R/DataFrame.R ---
@@ -212,9 +212,9 @@ setMethod("showDF",
#' show
#'
-#' Print the SparkDataFrame column names
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14776#discussion_r76090642
--- Diff: R/pkg/R/DataFrame.R ---
@@ -212,9 +212,9 @@ setMethod("showDF",
#' show
#'
-#' Print the SparkDataFrame column names
GitHub user junyangq opened a pull request:
https://github.com/apache/spark/pull/14779
[SparkR][Minor] Add more examples to window function docs
## What changes were proposed in this pull request?
This PR adds more examples to window function docs to make them more
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14776#discussion_r75980038
--- Diff: R/pkg/R/DataFrame.R ---
@@ -212,9 +212,9 @@ setMethod("showDF",
#' show
#'
-#' Print the SparkDataFrame column names
GitHub user junyangq opened a pull request:
https://github.com/apache/spark/pull/14776
[SparkR][Minor] Fix doc for show method
## What changes were proposed in this pull request?
The original doc of `show` put methods for multiple classes together but
the text only talks
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14761#discussion_r75970587
--- Diff: R/pkg/R/sparkR.R ---
@@ -550,3 +532,27 @@ processSparkPackages <- function(packages) {
}
splittedPacka
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/13690
Sounds great. Thank you @vectorijk
---
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14761#discussion_r75902901
--- Diff: R/pkg/R/utils.R ---
@@ -697,3 +697,20 @@ is_master_local <- function(master) {
is_sparkR_shell <- function() {
grepl("
GitHub user junyangq opened a pull request:
https://github.com/apache/spark/pull/14767
[SparkR][Minor] Remove Reference Link for the Common Windows Environment
Variables.
## What changes were proposed in this pull request?
The PR removes reference link in the doc
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14761#discussion_r75786649
--- Diff: R/pkg/R/utils.R ---
@@ -697,3 +697,20 @@ is_master_local <- function(master) {
is_sparkR_shell <- function() {
grepl("
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14761#discussion_r75786183
--- Diff: R/pkg/R/utils.R ---
@@ -697,3 +697,20 @@ is_master_local <- function(master) {
is_sparkR_shell <- function() {
grepl("
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14761#discussion_r75783153
--- Diff: R/pkg/R/install.R ---
@@ -125,20 +127,24 @@ robust_download_tar <- function(mirrorUrl, version, hadoopVersion, packageName,
message(
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/13690
Also, if you need any help with this PR, just let me know and we may work
together to make it.
---
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/13690
ping @vectorijk Have you started working on the random forest wrapper. If
not and feel busy doing that, I can also work on that :)
---
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14759
ah never mind, just saw the log. Thanks @shivaram
---
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14759
LGTM.
---
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14759
I see. Perhaps we can put that in the document somewhere if not yet? Does
Jenkins run check-cran under branch 2.0 or it uses no-test?
---
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14759
I see. Thanks for the clarification.
My concern is somewhat unrelated to this update. If we work under master
branch and then run cran-check, it will fail right? Or did I misunderstand
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14759
SGTM. One other thing is we can only run this test under branch-2.0 right?
---
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14761
Thanks @mengxr
---
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14759
Would it be useful to pick out the specific errors and warnings?
---
GitHub user junyangq opened a pull request:
https://github.com/apache/spark/pull/14761
[SparkR][Minor] Add installation message for remote master mode and improve
other messages
## What changes were proposed in this pull request?
This PR tries to give informative message
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14734#discussion_r75719798
--- Diff: R/pkg/NAMESPACE ---
@@ -1,5 +1,9 @@
# Imports from base R
-importFrom(methods, setGeneric, setMethod, setOldClass)
+# Do not include
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14734#discussion_r75719536
--- Diff: R/pkg/R/DataFrame.R ---
@@ -3058,7 +3057,7 @@ setMethod("str",
#' @note drop since 2.0.0
setMethod("drop",
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14735
LGTM
---
GitHub user junyangq opened a pull request:
https://github.com/apache/spark/pull/14743
[SparkR][Minor] Fix Cache Folder Path in Windows
## What changes were proposed in this pull request?
This PR tries to fix the scheme of local cache folder in Windows. The name
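A cross-platform way to pick the local cache folder, in the spirit of the fix above, is sketched below. The specific directory names (`%LOCALAPPDATA%\Apache\Spark\Cache` and so on) are illustrative assumptions, not necessarily the paths the PR settles on.

```r
# Illustrative sketch: per-OS user cache directory for downloaded Spark builds.
spark_cache_dir <- function() {
  if (.Platform$OS.type == "windows") {
    # Windows keeps per-user caches under %LOCALAPPDATA%
    file.path(Sys.getenv("LOCALAPPDATA"), "Apache", "Spark", "Cache")
  } else if (Sys.info()[["sysname"]] == "Darwin") {
    # macOS convention
    file.path(Sys.getenv("HOME"), "Library", "Caches", "spark")
  } else {
    # Linux / other Unix convention
    file.path(Sys.getenv("HOME"), ".cache", "spark")
  }
}

spark_cache_dir()
```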
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14735#discussion_r75599753
--- Diff: R/pkg/inst/tests/testthat/test_mllib.R ---
@@ -95,6 +95,10 @@ test_that("spark.glm summary", {
expect_equal(stats$df.residu
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14740
LGTM
---
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14735#discussion_r75598536
--- Diff: R/pkg/R/mllib.R ---
@@ -1163,17 +1145,17 @@ setMethod("spark.als", signature(data = "SparkDataFrame"),
#' @export
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14735#discussion_r75598531
--- Diff: R/pkg/R/mllib.R ---
@@ -1052,8 +1034,8 @@ setMethod("summary", signature(object = "GaussianMixtureModel"),
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14735#discussion_r75598520
--- Diff: R/pkg/R/mllib.R ---
@@ -610,8 +616,8 @@ setMethod("summary", signature(object = "KMeansModel"),
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14734#discussion_r75580853
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2880,7 +2880,7 @@ setMethod("fillna",
#'
#' @param x a SparkDataFrame.
#' @param row.
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14734#discussion_r75580820
--- Diff: R/pkg/R/DataFrame.R ---
@@ -2756,7 +2756,7 @@ setMethod("summary",
#' @param minNonNulls if specified, drop rows that have
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14734#discussion_r75580749
--- Diff: R/pkg/R/functions.R ---
@@ -3113,7 +3113,7 @@ setMethod("ifelse",
#' N = total number of rows in the partition
#' c
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14705
Thanks @felixcheung!
---
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14705
That makes sense. Perhaps this could be done in another PR?
---
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14666#discussion_r75513347
--- Diff: R/pkg/R/utils.R ---
@@ -689,3 +689,33 @@ getSparkContext <- function() {
sc <- get(".sparkRjsc", envir = .spar
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14705
Yeah, I totally agree that in terms of usage this is safer. Then the doc
for `...` would be an issue. If we keep to the principle that doc be close to
the function, then `...` would be in generic
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14705#discussion_r75441096
--- Diff: R/pkg/R/DataFrame.R ---
@@ -932,7 +932,7 @@ setMethod("sample_frac",
#' @param x a SparkDataFrame.
#' @family SparkDataFrame
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14666#discussion_r75440848
--- Diff: R/pkg/R/utils.R ---
@@ -689,3 +689,33 @@ getSparkContext <- function() {
sc <- get(".sparkRjsc", envir = .spar
Github user junyangq commented on the issue:
https://github.com/apache/spark/pull/14384
test this please
---
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14705#discussion_r75437776
--- Diff: R/pkg/R/functions.R ---
@@ -2276,9 +2276,8 @@ setMethod("n_distinct", signature(x = "Column"),
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14705#discussion_r75437423
--- Diff: R/pkg/R/functions.R ---
@@ -1335,7 +1336,7 @@ setMethod("rtrim",
#' @note sd since 1.6.0
setMethod("sd",
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14705#discussion_r75437149
--- Diff: R/pkg/R/mllib.R ---
@@ -917,14 +922,14 @@ setMethod("spark.lda", signature(data = "SparkDataFrame"),
# Returns a summary
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14705#discussion_r75431046
--- Diff: R/pkg/R/functions.R ---
@@ -319,7 +316,7 @@ setMethod("column",
#'
#' Computes the Pearson Correlation Coefficient for t
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14705#discussion_r75430253
--- Diff: R/pkg/R/SQLContext.R ---
@@ -727,6 +730,7 @@ dropTempView <- function(viewName) {
#' @param source The name of external data sou
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14705#discussion_r75430152
--- Diff: R/pkg/R/functions.R ---
@@ -1848,7 +1850,7 @@ setMethod("upper",
#' @note var since 1.6.0
setMethod("var",
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14705#discussion_r75430048
--- Diff: R/pkg/R/functions.R ---
@@ -3115,6 +3166,11 @@ setMethod("dense_rank",
#'
#' This is equivalent to the LAG funct
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14705#discussion_r7543
--- Diff: R/pkg/R/functions.R ---
@@ -3115,6 +3166,11 @@ setMethod("dense_rank",
#'
#' This is equivalent to the LAG funct
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14705#discussion_r75429772
--- Diff: R/pkg/R/mllib.R ---
@@ -620,11 +625,12 @@ setMethod("predict", signature(object = "KMeansModel"),
#' predictions on
Github user junyangq commented on a diff in the pull request:
https://github.com/apache/spark/pull/14705#discussion_r75429664
--- Diff: R/pkg/R/DataFrame.R ---
@@ -3187,6 +3221,7 @@ setMethod("histogram",
#' @param x A SparkDataFrame
#' @param url JDBC da