We can reproduce this using the following code:
val spark = SparkSession.builder().appName("test").master("local").getOrCreate()
val sql1 =
  """
    |create temporary view tb as select * from values
    |(1, 0),
    |(1, 0),
    |(2, 0)
    |as grouping(a, b)
  """.stripMargin
val sql =
"""
Hello Dev / Users,
I am migrating PySpark code to Scala. In Python, iterating over a
dictionary and generating JSON with null values is possible with
json.dumps(), and the result can be converted to a Spark Row. In Scala,
how can we generate JSON with null values as a DataFrame?
Thanks.
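A minimal sketch of one way to answer the question above, assuming Spark 3.0+ (the object name, column names, and sample data are illustrative): by default Spark's JSON generation drops null fields, but the `ignoreNullFields` option to `to_json` (or the `spark.sql.jsonGenerator.ignoreNullFields` config) keeps them as explicit JSON nulls.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, struct, to_json}

object JsonWithNulls {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("json-nulls")
      .master("local")
      .getOrCreate()
    import spark.implicits._

    // Option[String] fields become NULL values in the DataFrame
    val df = Seq((1, Some("a")), (2, None: Option[String])).toDF("id", "name")

    // df.toJSON and to_json omit null fields by default; since Spark 3.0
    // the ignoreNullFields option keeps them as explicit JSON nulls.
    val jsonDf = df.select(
      to_json(
        struct(df.columns.map(col): _*),
        Map("ignoreNullFields" -> "false")
      ).as("json")
    )

    jsonDf.show(false)
    spark.stop()
  }
}
```

With `ignoreNullFields` set to `"false"`, the second row should render with an explicit null, along the lines of `{"id":2,"name":null}`, instead of the field being dropped.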
Do we need a VOTE? Heck, I think anyone can call one, anyway.
Pre-flight vote check: anyone have objections to the text as-is?
See
https://docs.google.com/document/d/1-Zdi_W-wtuxS9hTK0P9qb2x-nRanvXmnZ7SUi4qMljg/edit#
If so, let's hash out specific suggested changes.
If not, then I think the next ste
Another week, another ping. Anyone on the PMC willing to call a vote on
this?
On Mon, Feb 27, 2017 at 3:08 PM, Ryan Blue wrote:
> I'd like to see more discussion on the issues I raised. I don't think
> there was a response for why voting is limited to PMC members.
>
> Tim was kind enough to rep