Github user zero323 commented on the pull request:
https://github.com/apache/spark/pull/9099#issuecomment-147920328
@shivaram, @felixcheung
OK, I am puzzled here. I've played with different test scenarios, and this
PR is either buggy or fixes an unreported bug. Suppose we have the following
local data frame:
```r
ldf <- structure(
  list(
    foo = list(structure(list(a = 1, b = 3), .Names = c("a", "b")),
               structure(list(a = 2, c = 6), .Names = c("a", "c"))),
    bar = c(1, 2), baz = c("a", "b")
  ),
  .Names = c("foo", "bar", "baz"), row.names = c("1", "2"),
  class = "data.frame"
)
ldf
## foo bar baz
## 1 1, 3 1 a
## 2 2, 6 2 b
str(ldf)
## 'data.frame': 2 obs. of 3 variables:
## $ foo:List of 2
## ..$ :List of 2
## .. ..$ a: num 1
## .. ..$ b: num 3
## ..$ :List of 2
## .. ..$ a: num 2
## .. ..$ c: num 6
## $ bar: num 1 2
## $ baz: chr "a" "b"
```
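For readability, here is an equivalent, more direct way to build the same data frame (a sketch; `stringsAsFactors = FALSE` keeps `baz` character, matching the `dput` output above):
```r
ldf <- data.frame(bar = c(1, 2), baz = c("a", "b"), stringsAsFactors = FALSE)
ldf$foo <- list(list(a = 1, b = 3), list(a = 2, c = 6))  # list column
ldf <- ldf[, c("foo", "bar", "baz")]                     # restore original column order
```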
On 1.5.1, an attempt to convert this to a Spark DataFrame fails with the
following error:
```r
sdf <- createDataFrame(sqlContext, ldf)
## Error in structField.character(names[[i]], types[[i]], TRUE) :
## Field type must be a string.
```
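The failure presumably happens during schema inference: the list column's R type has no mapping to a Spark SQL type in 1.5.1, so `structField` ends up receiving something other than a type string. Plain R shows the column classes involved:
```r
sapply(ldf, class)
##         foo         bar         baz
##      "list"   "numeric" "character"
```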
The patched version, on the other hand, creates a reasonable schema:
```r
sdf <- createDataFrame(sqlContext, ldf)
printSchema(sdf)
## root
## |-- foo: array (nullable = true)
## | |-- element: double (containsNull = true)
## |-- bar: double (nullable = true)
## |-- baz: string (nullable = true)
```
I believe the patched behavior is what we want here, but as far as I can
tell it is covered by neither tests nor docs.
I admit this is not the most typical R `data.frame`; nevertheless, I believe
such input should be either accepted as valid or intentionally rejected as invalid.
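If the patched behavior is the intended one, a regression test along these lines could pin it down (a minimal sketch, assuming SparkR's `testthat` setup; the expected `array<double>` type mirrors the `printSchema` output above):
```r
test_that("createDataFrame supports list columns", {
  ldf <- data.frame(bar = c(1, 2), baz = c("a", "b"), stringsAsFactors = FALSE)
  ldf$foo <- list(list(a = 1, b = 3), list(a = 2, c = 6))
  sdf <- createDataFrame(sqlContext, ldf)
  expect_equal(count(sdf), 2)
  # foo is the third column here; its Spark type should be array<double>
  expect_equal(dtypes(sdf)[[3]], c("foo", "array<double>"))
})
```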