Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/23012#discussion_r232881732
--- Diff: R/pkg/R/sparkR.R ---
@@ -283,6 +283,10 @@ sparkR.session <- function(
enableHiveSupport = TRUE,
...) {
+ if (ut
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/23012#discussion_r232882419
--- Diff: docs/index.md ---
@@ -31,7 +31,8 @@ Spark runs on both Windows and UNIX-like systems (e.g.
Linux, Mac OS). It's easy
locally o
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/23012#discussion_r232882178
--- Diff: docs/index.md ---
@@ -31,7 +31,8 @@ Spark runs on both Windows and UNIX-like systems (e.g.
Linux, Mac OS). It's easy
locally o
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/23012#discussion_r232881594
--- Diff: R/WINDOWS.md ---
@@ -3,7 +3,7 @@
To build SparkR on Windows, the following steps are required
1. Install R (>= 3.1)
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/23012
FYI
This is unused code. I'm going to remove it
https://github.com/apache/spark/blob/master/R/pkg/src-native/string_hash_code.c
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/23012
Also I think the warning should be in .First in general.R
---
-
To unsubscribe, e-mail: reviews-unsubscr
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/23012
I think this should say unsupported (i.e. it could still work) instead of
deprecated.
Also the compareVersion should check both major and minor, e.g. 3.4.0
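The major-and-minor version check suggested above can be sketched as follows. This is a hypothetical Python illustration (the actual SparkR code would use R's `utils::compareVersion`); the function name and thresholds are assumptions for the example only.

```python
def version_at_least(version, minimum):
    """Compare dotted version strings on all components, so that
    checking against "3.4" inspects both major AND minor parts."""
    parts = [int(p) for p in version.split(".")]
    floor = [int(p) for p in minimum.split(".")]
    # Pad the shorter list with zeros so "3.4" compares like "3.4.0".
    width = max(len(parts), len(floor))
    parts += [0] * (width - len(parts))
    floor += [0] * (width - len(floor))
    return parts >= floor

# Checking only the major version would wrongly accept "3.0.2" here:
print(version_at_least("3.0.2", "3.4"))  # False
print(version_at_least("3.4.0", "3.4"))  # True
```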
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22939#discussion_r232500194
--- Diff: R/pkg/R/functions.R ---
@@ -2230,6 +2237,32 @@ setMethod("from_json", signature(x = "Column",
schema = &q
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232500065
--- Diff: R/pkg/R/SQLContext.R ---
@@ -172,36 +257,72 @@ getDefaultSqlSource <- function() {
createDataFrame <- function(data, schema
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232499902
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,91 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232499848
--- Diff: R/pkg/tests/fulltests/test_sparkSQL.R ---
@@ -307,6 +307,64 @@ test_that("create DataFrame from RDD", {
unsetH
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232499794
--- Diff: R/pkg/tests/fulltests/test_sparkSQL.R ---
@@ -307,6 +307,64 @@ test_that("create DataFrame from RDD", {
unsetH
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22993#discussion_r232499645
--- Diff: common/unsafe/src/main/java/org/apache/spark/unsafe/Platform.java
---
@@ -67,6 +67,59 @@
unaligned = _unaligned
GitHub user felixcheung opened a pull request:
https://github.com/apache/spark/pull/23007
[SPARK-26010] fix vignette eval with Java 11
## What changes were proposed in this pull request?
changes in vignette only to disable eval
## How was this patch tested
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22993#discussion_r232477875
--- Diff: common/unsafe/src/main/java/org/apache/spark/unsafe/Platform.java
---
@@ -67,6 +67,59 @@
unaligned = _unaligned
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22993
what settings we need to allow `illegal reflective access`
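For context, on Java 9+ the JVM settings that relax this are typically `--add-opens` flags (or, on JDK 9-16, `--illegal-access=permit`). A sketch of what such settings might look like; the specific module/package list below is illustrative only, not something this thread confirms as Spark's official set:

```shell
# Hypothetical example: open specific JDK packages to unnamed modules
# so reflective access (e.g. by Platform.java) is permitted on Java 9+.
export JAVA_OPTS="--add-opens=java.base/java.lang=ALL-UNNAMED \
  --add-opens=java.base/sun.nio.ch=ALL-UNNAMED"
```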
---
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22989
and catching Error or Throwable..
---
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22989#discussion_r232477783
--- Diff: scalastyle-config.xml ---
@@ -240,6 +240,18 @@ This file is divided into 3 sections:
]]>
+
+throw
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22977
right, I mean both this and that should be part of the process
"post-release"
---
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232477365
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,55 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232477325
--- Diff: R/pkg/R/SQLContext.R ---
@@ -189,19 +238,67 @@ createDataFrame <- function(data, schema = NULL,
samplingRatio = 1.0,
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232477271
--- Diff: R/pkg/R/SQLContext.R ---
@@ -189,19 +238,67 @@ createDataFrame <- function(data, schema = NULL,
samplingRatio = 1.0,
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232477257
--- Diff: R/pkg/R/SQLContext.R ---
@@ -189,19 +238,67 @@ createDataFrame <- function(data, schema = NULL,
samplingRatio = 1.0,
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232477171
--- Diff: R/pkg/R/SQLContext.R ---
@@ -172,10 +221,10 @@ getDefaultSqlSource <- function() {
createDataFrame <- function(data, schema
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232477155
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,55 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232477131
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala
---
@@ -225,4 +226,25 @@ private[sql] object SQLUtils extends Logging
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22997
btw, please see the page https://spark.apache.org/contributing.html and
particularly "Pull Request" on the format.
---
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22997
thx, but I'm not sure about this approach. This step will now cause the hadoop
jar to be packaged into the release tarball of hadoop-provided, which is
undoing the point of hadoop-provided
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22977
I think also there is a hive metastore test that downloads spark release
jar?
---
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22967#discussion_r232178323
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR`
commands, or if initiali
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232170936
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,55 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232176721
--- Diff: R/pkg/R/SQLContext.R ---
@@ -189,19 +238,67 @@ createDataFrame <- function(data, schema = NULL,
samplingRatio = 1.0,
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232172687
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,55 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232169938
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,55 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232173367
--- Diff: R/pkg/R/SQLContext.R ---
@@ -189,19 +238,67 @@ createDataFrame <- function(data, schema = NULL,
samplingRatio = 1.0,
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232173043
--- Diff: R/pkg/R/SQLContext.R ---
@@ -172,10 +221,10 @@ getDefaultSqlSource <- function() {
createDataFrame <- function(data, schema
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232167634
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,55 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232172546
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,55 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232177132
--- Diff: R/pkg/R/SQLContext.R ---
@@ -189,19 +238,67 @@ createDataFrame <- function(data, schema = NULL,
samplingRatio = 1.0,
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232167926
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/api/r/SQLUtils.scala
---
@@ -225,4 +226,25 @@ private[sql] object SQLUtils extends Logging
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232176774
--- Diff: R/pkg/R/SQLContext.R ---
@@ -189,19 +238,67 @@ createDataFrame <- function(data, schema = NULL,
samplingRatio = 1.0,
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232171176
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,55 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232167480
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,55 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r232167110
--- Diff: R/pkg/R/SQLContext.R ---
@@ -215,14 +278,16 @@ createDataFrame <- function(data, schema = NULL,
samplingRatio = 1.0,
}
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22939#discussion_r232166370
--- Diff: R/pkg/R/functions.R ---
@@ -2230,6 +2237,32 @@ setMethod("from_json", signature(x = "Column",
schema = &q
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22967#discussion_r231819016
--- Diff: docs/sparkr.md ---
@@ -133,7 +133,7 @@ specifying `--packages` with `spark-submit` or `sparkR`
commands, or if initiali
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22932
Does it have different values for
new native ORC writer, old Hive ORC writer
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22948#discussion_r231598045
--- Diff: dev/appveyor-install-dependencies.ps1 ---
@@ -115,7 +115,7 @@ $env:Path += ";$env:HADOOP_HOME\bin"
Po
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22921#discussion_r231596680
--- Diff: R/pkg/R/functions.R ---
@@ -1663,9 +1692,24 @@ setMethod("toDegrees",
#' @aliases toRadians toRadians,Column-metho
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22921#discussion_r231403827
--- Diff: R/pkg/R/functions.R ---
@@ -319,6 +319,27 @@ setMethod("acos",
column(jc)
})
+#
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22939#discussion_r231403096
--- Diff: R/pkg/R/functions.R ---
@@ -2230,6 +2237,32 @@ setMethod("from_json", signature(x = "Column",
schema = &q
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r231402726
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,30 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r231402235
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,30 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r231402297
--- Diff: R/pkg/R/SQLContext.R ---
@@ -172,15 +196,17 @@ getDefaultSqlSource <- function() {
createDataFrame <- function(data, schema
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r231402063
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,30 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22954#discussion_r231401994
--- Diff: R/pkg/R/SQLContext.R ---
@@ -147,6 +147,30 @@ getDefaultSqlSource <- function() {
l[["spark.sql.sources
Github user felixcheung commented on the issue:
https://github.com/apache/zeppelin/pull/3206
yes, everything is out (except for the announcement)
---
Github user felixcheung commented on the issue:
https://github.com/apache/zeppelin/pull/3217
LGTM
---
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22939#discussion_r231025592
--- Diff: R/pkg/R/functions.R ---
@@ -205,11 +205,18 @@ NULL
#' also supported for the schema.
#' \item \cod
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22939#discussion_r231025282
--- Diff: R/pkg/R/functions.R ---
@@ -2230,6 +2237,32 @@ setMethod("from_json", signature(x = "Column",
schema = &q
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22921#discussion_r231024007
--- Diff: R/pkg/R/functions.R ---
@@ -1641,30 +1641,30 @@ setMethod("tanh",
})
#' @details
-#' \code{
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22921#discussion_r231023768
--- Diff: R/pkg/R/generics.R ---
@@ -748,7 +748,7 @@ setGeneric("add_months", function(y, x) {
standardGeneric("add_months") })
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22939#discussion_r230649513
--- Diff: R/pkg/R/functions.R ---
@@ -205,11 +205,18 @@ NULL
#' also supported for the schema.
#' \item \cod
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22939#discussion_r230650176
--- Diff: R/pkg/R/functions.R ---
@@ -2230,6 +2237,32 @@ setMethod("from_json", signature(x = "Column",
schema = &q
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22939#discussion_r230649120
--- Diff: R/pkg/R/functions.R ---
@@ -2230,6 +2237,32 @@ setMethod("from_json", signature(x = "Column",
schema = &q
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22939#discussion_r230649693
--- Diff: R/pkg/R/functions.R ---
@@ -2260,6 +2293,32 @@ setMethod("from_csv", signature(x = "Column", schema
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22683
Jenkins, ok to test
---
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22921#discussion_r230568079
--- Diff: R/pkg/R/functions.R ---
@@ -1641,30 +1641,30 @@ setMethod("tanh",
})
#' @details
-#' \code{
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22921#discussion_r230568058
--- Diff: R/pkg/R/generics.R ---
@@ -748,7 +748,7 @@ setGeneric("add_months", function(y, x) {
standardGeneric("add_months") })
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22921#discussion_r230568088
--- Diff: R/pkg/R/functions.R ---
@@ -319,23 +319,23 @@ setMethod("acos",
})
#' @details
-#' \code
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22864
Jenkins, retest this please
---
GitHub user felixcheung opened a pull request:
https://github.com/apache/spark/pull/22866
[SPARK-12172][SPARKR] Remove internal-only RDD methods
## What changes were proposed in this pull request?
Remove non-public internal only methods for RDD in SparkR
## How was
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22863
please close this PR. thx
---
Repository: spark
Updated Branches:
refs/heads/branch-2.4 313a1f0a7 -> f575616db
[SPARK-25859][ML] add scala/java/python example and doc for PrefixSpan
## What changes were proposed in this pull request?
add scala/java/python example and doc for PrefixSpan in branch 2.4
## How was this patch
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22863
merged to 2.4
---
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22863#discussion_r228727912
--- Diff:
examples/src/main/scala/org/apache/spark/examples/ml/PrefixSpanExample.scala ---
@@ -0,0 +1,62 @@
+/*
+ * Licensed to the Apache
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22865
ok to test
---
ile
registerTempTable
saveAsParquetFile
unionAll
createExternalTable
dropTempTable
## How was this patch tested?
jenkins
Author: Felix Cheung
Closes #22843 from felixcheung/rrddapi.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/com
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22843
merged to master
---
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22843#discussion_r228701261
--- Diff: docs/sparkr.md ---
@@ -717,4 +717,5 @@ You can inspect the search path in R with
[`search()`](https://stat.ethz.ch/R-ma
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22843#discussion_r228701032
--- Diff: R/pkg/tests/fulltests/test_sparkSQL.R ---
@@ -3477,39 +3447,6 @@ test_that("Window functions on a DataFrame", {
expect_eq
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22843#discussion_r228701116
--- Diff: R/pkg/tests/fulltests/test_sparkSQL.R ---
@@ -3477,39 +3447,6 @@ test_that("Window functions on a DataFrame", {
expect_eq
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/21710
Sure
From: Huaxin Gao
Sent: Friday, October 26, 2018 1:16:38 PM
To: apache/spark
Cc: Felix Cheung; Mention
Subject: Re: [apache
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/21588
Sounds like we should try this then
---
GitHub user felixcheung opened a pull request:
https://github.com/apache/spark/pull/22843
[SPARK-16693][SPARKR] Remove methods deprecated
## What changes were proposed in this pull request?
Remove deprecated functions which includes:
SQLContext/HiveContext stuff
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/21588
does Apache Hive 2.3.2 have all the fixes we need?
---
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22815
Because it was in L457 - multiple function doc blob merged together and
CRAN checks for the final version for missing param
Thx for catching
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22821
Please ping me on the R removal.
---
Repository: spark
Updated Branches:
refs/heads/master 19ada15d1 -> ddd1b1e8a
[SPARK-24572][SPARKR] "eager execution" for R shell, IDE
## What changes were proposed in this pull request?
Check the `spark.sql.repl.eagerEval.enabled` configuration property in
SparkDataFrame `show()` method. If
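The behavior described, a `show()` that renders the DataFrame only when the eager-eval property is enabled, can be sketched generically. Plain Python stands in for the SparkR implementation here; the config key matches the commit message, but the classes and method names are hypothetical stand-ins:

```python
class FakeSession:
    """Minimal stand-in for a Spark session's read-only runtime config."""
    def __init__(self, conf):
        self._conf = conf

    def get_conf(self, key, default="false"):
        return self._conf.get(key, default)

class FakeDataFrame:
    def __init__(self, session, rows):
        self.session = session
        self.rows = rows

    def show(self):
        # Mirror the described pattern: render contents only when
        # spark.sql.repl.eagerEval.enabled is "true"; otherwise just
        # identify the object, as a non-eager REPL display would.
        if self.session.get_conf("spark.sql.repl.eagerEval.enabled") == "true":
            return "\n".join(str(r) for r in self.rows)
        return "<SparkDataFrame (eager evaluation disabled)>"

eager = FakeDataFrame(FakeSession({"spark.sql.repl.eagerEval.enabled": "true"}),
                      [(1, "a")])
lazy = FakeDataFrame(FakeSession({}), [(1, "a")])
print(eager.show())  # (1, 'a')
print(lazy.show())   # <SparkDataFrame (eager evaluation disabled)>
```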
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22455
LGTM. merged to master
---
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22810
merged to master
---
Repository: spark
Updated Branches:
refs/heads/master b2e325625 -> 19ada15d1
[SPARK-24516][K8S] Change Python default to Python3
## What changes were proposed in this pull request?
As this is targeted for 3.0.0 and Python2 will be deprecated by Jan 1st, 2020,
I feel it is appropriate to chan
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22815#discussion_r228045893
--- Diff: R/pkg/R/SQLContext.R ---
@@ -343,7 +343,6 @@ setMethod("toDF", signature(x = "RDD"),
#' path <- &q
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/22668
ouch sorry
---
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22801#discussion_r227466613
--- Diff: examples/src/main/python/sql/datasource.py ---
@@ -57,6 +57,15 @@ def basic_datasource_example(spark):
format
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22801#discussion_r227466415
--- Diff: examples/src/main/r/RSparkSQLExample.R ---
@@ -118,6 +118,10 @@ df <-
read.df("examples/src/main/resources/people.csv"
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/22791#discussion_r227465232
--- Diff: examples/src/main/r/RSparkSQLExample.R ---
@@ -114,7 +114,7 @@ write.df(namesAndAges, "namesAndAges.parquet",
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/20761#discussion_r227462444
--- Diff: docs/running-on-yarn.md ---
@@ -121,6 +121,43 @@ To use a custom metrics.properties for the application
master and executors, upd
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/21306
where are we on this?
---
Github user felixcheung commented on the issue:
https://github.com/apache/spark/pull/21710
merged to master
---