Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/15239#discussion_r82445435
--- Diff: R/pkg/R/SQLContext.R ---
@@ -341,11 +342,13 @@ setMethod("toDF", signature(x = "RDD"),
#' @name read.json
#' @method read.json default
#' @note read.json since 1.6.0
-read.json.default <- function(path) {
+read.json.default <- function(path, ...) {
sparkSession <- getSparkSession()
+ options <- varargsToStrEnv(...)
# Allow the user to have a more flexible definition of the text file path
paths <- as.list(suppressWarnings(normalizePath(path)))
read <- callJMethod(sparkSession, "read")
+ read <- callJMethod(read, "options", options)
--- End diff ---
ah, thank you for the very detailed analysis and tests.
I think it would generally be great to match the Scala/Python behavior (though
not only for the sake of matching it) and have read accept all path(s).
```
> read.json(path = "hyukjin.json", path = "felix.json")
Error in dispatchFunc("read.json(path)", x, ...) :
argument "x" is missing, with no default
```
This is because of the parameter hack in `dispatchFunc`.
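A minimal base-R illustration (this is not SparkR's actual `dispatchFunc`, just a sketch of the underlying argument-matching behavior) of why the first error appears: when every actual argument is named `path`, none of them matches the formal `x`, so both fall into `...` and `x` is left missing.

```r
# Sketch only: a wrapper with signature (x, ...) like the dispatch hack uses.
f <- function(x, ...) x

# Neither named `path` argument matches the formal `x` (no partial match),
# so both land in `...` and evaluating `x` fails:
f(path = "hyukjin.json", path = "felix.json")
# Error: argument "x" is missing, with no default
```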
```
> read.df(path = "hyukjin.json", path = "felix.json", source = "json")
Error in f(x, ...) :
formal argument "path" matched by multiple actual arguments
```
I think read.df is somewhat unique in that its first parameter is named
`path` - this is both helpful (if we don't want to support multiple paths like
this) and bad (the user can't specify multiple paths).
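The second error is plain R argument matching, independent of SparkR: a formal parameter may be bound at most once, so two named `path` arguments against a formal named `path` fail immediately. A minimal sketch:

```r
# Sketch only: any function whose formal is literally named `path`.
g <- function(path, ...) path

# Both actuals exact-match the formal `path`, which R rejects:
g(path = "hyukjin.json", path = "felix.json")
# Error: formal argument "path" matched by multiple actual arguments
```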
```
> varargsToStrEnv("a", 1, 2, 3)
<environment: 0x7f815ba34d18>
```
This case is somewhat dangerous - I think we end up passing a list of
unnamed properties to the JVM side - it might be a good idea to check for the
`zero-length variable name` case - perhaps you could open a JIRA on that?
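One possible shape for such a check (the helper name `stopIfUnnamed` is illustrative, not part of SparkR): validate that every vararg carries a non-empty name before building the options environment.

```r
# Hypothetical guard, sketched for illustration: reject arguments with
# zero-length names instead of silently forwarding them to the JVM side.
stopIfUnnamed <- function(...) {
  args <- list(...)
  ns <- names(args)
  if (length(args) > 0 && (is.null(ns) || any(ns == ""))) {
    stop("all option arguments must be named, e.g. options(a = \"1\")")
  }
  invisible(args)
}
```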