bjornjorgensen commented on a change in pull request #35296:
URL: https://github.com/apache/spark/pull/35296#discussion_r794889539
##########
File path: python/pyspark/pandas/generic.py
##########
@@ -980,6 +983,9 @@ def to_json(
"""
    if "options" in options and isinstance(options.get("options"), dict) and len(options) == 1:
options = options.get("options")
+
+ default_options: Dict[str, Any] = {"ignoreNullFields": False}
+ options = {**default_options, **options}
Review comment:
```
./dev/reformat-python
reformatted python/pyspark/pandas/generic.py
All done! ✨ 🍰 ✨
1 file reformatted, 367 files left unchanged.
```
Now flake8 prints `F821 undefined name 'Dict'`.
This error is a bug in pyflakes; see [Why does flake8 give the warning F821
undefined name 'match' when using
match?](https://stackoverflow.com/questions/67547135/why-does-flake8-give-the-warning-f821-undefined-name-match-when-using-match)
We could add `# noqa: F821`, but then we would need to remove it again once
pyflakes is fixed. I don't think we should do that.
I have tested this manually with the
[bjornjorgensen/spark-notebook:spark-test260122](https://hub.docker.com/layers/bjornjorgensen/spark-notebook/spark-test260122/images/sha256-4738a07b6605b6d84875a2dd604760b67f841ccf2167ec18440df3b11583de3f?context=repo&tab=layers)
docker image:
```python
data = {'col_1': [3, 2, 1, 0], 'col_2': [None, None, None, None]}
test = ps.DataFrame.from_dict(data)
test.to_json("test.json")
test2 = ps.read_json("test.json/*")
test2
#    col_1 col_2
# 0      3  None
# 1      2  None
# 2      1  None
# 3      0  None
test2.to_json("test2.json", ignoreNullFields=True)
test3 = ps.read_json("test2.json/*")
test3
#    col_1
# 0      3
# 1      2
# 2      1
# 3      0
```
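The behaviour above follows from two things: the dict-merge in the patch (later dicts win, so an explicit user-supplied `ignoreNullFields` overrides the new `False` default), and the fact that dropping null fields makes an all-null column unrecoverable on round-trip. A minimal standalone sketch in plain Python (no Spark; the variable names and the null-dropping loop are illustrative only, not Spark's actual JSON writer):

```python
from typing import Any, Dict

# Merge semantics from the patch: {**defaults, **user} lets user options
# override the defaults, because later entries win in a dict literal.
default_options: Dict[str, Any] = {"ignoreNullFields": False}

user_options: Dict[str, Any] = {}
merged = {**default_options, **user_options}
print(merged)  # {'ignoreNullFields': False}

user_options = {"ignoreNullFields": True}
merged = {**default_options, **user_options}
print(merged)  # {'ignoreNullFields': True}

# Why a null-only column disappears on round-trip when null fields are
# dropped on write: nothing in the output records mentions col_2 anymore.
rows = [{"col_1": 3, "col_2": None}, {"col_1": 2, "col_2": None}]
dropped = [{k: v for k, v in r.items() if v is not None} for r in rows]
print(dropped)  # [{'col_1': 3}, {'col_1': 2}]
```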
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]