GitHub user HyukjinKwon opened a pull request:
https://github.com/apache/spark/pull/15354
[SPARK-17764][SQL] Add `to_json` supporting to convert nested struct column
to JSON string
## What changes were proposed in this pull request?
This PR proposes to add a `to_json` function as the counterpart of `from_json` in
Scala, Java and Python.
It would be useful to be able to convert the same column both from and to JSON.
Also, some data sources do not support nested types; if we need to save a
DataFrame into such a data source, this function offers a workaround.
The usage is as below:
```scala
val df = Seq(Tuple1(Tuple1(1))).toDF("a")
df.select(to_json($"a").as("json")).show()
```
```
+--------+
| json|
+--------+
|{"_1":1}|
+--------+
```
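For readers without a Spark session at hand, the effect of `to_json` on a struct column can be sketched in plain Python with the standard `json` module. This only mimics the observable behavior shown above (a struct value serialized to a compact JSON string), not Spark's actual implementation; the `rows` variable is a hypothetical stand-in for the DataFrame:

```python
import json

# Hypothetical rows of a DataFrame with one struct column "a",
# where the struct has a single field "_1" (mirroring Tuple1(1)).
rows = [{"a": {"_1": 1}}]

# to_json serializes each struct value to a compact JSON string
# (no spaces after separators, matching the output above).
json_col = [json.dumps(row["a"], separators=(",", ":")) for row in rows]
print(json_col)  # ['{"_1":1}']
```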
## How was this patch tested?
Unit tests in `JsonFunctionsSuite` and `JsonExpressionsSuite`.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/HyukjinKwon/spark SPARK-17764
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/15354.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #15354
----
commit a33ac60902f233595158ed034b0fa49bcf9ac5ab
Author: hyukjinkwon <[email protected]>
Date: 2016-10-05T00:56:08Z
Initial implementation
commit d382b6364fd80a90375573e3fb68b9db2bf3cffb
Author: hyukjinkwon <[email protected]>
Date: 2016-10-05T02:42:00Z
Fix minor comment nits
commit eec0cd32bde8564a080da425be48986055523e8c
Author: hyukjinkwon <[email protected]>
Date: 2016-10-05T02:50:45Z
Add a missing dot
----