If you're on the RDD API, avoid groupByKey and use reduceByKey, which combines values map-side before the shuffle.
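
For example (a rough sketch on the RDD API; the pair RDD below is made up
for illustration):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("reduce-example").getOrCreate()

// A hypothetical pair RDD of (key, count).
val pairs = spark.sparkContext.parallelize(Seq(("foo", 1), ("bar", 2), ("foo", 3)))

// reduceByKey combines values on each partition before shuffling, so far
// less data crosses the network than with groupByKey.
val summed = pairs.reduceByKey(_ + _)

// groupByKey produces the same totals but shuffles every value first.
val summedViaGroup = pairs.groupByKey().mapValues(_.sum)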

Regards,
Vaquar khan

On Jun 4, 2017 8:32 AM, "Guy Cohen" <g...@gettaxi.com> wrote:

> Try this one:
>
> import org.apache.spark.sql.functions.{expr, when}
> df.groupBy(
>   when(expr("field1 = 'foo'"), "field1")
>     .when(expr("field2 = 'bar'"), "field2"))
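>
> Rows matching neither branch fall into a single null group; if that's not
> what you want, add a fallback label (a sketch; the "other" and "grp" names
> are made up):
>
> df.groupBy(
>     when(expr("field1 = 'foo'"), "field1")
>       .when(expr("field2 = 'bar'"), "field2")
>       .otherwise("other").as("grp"))
>   .count()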
>
>
> On Sun, Jun 4, 2017 at 3:16 AM, Bryan Jeffrey <bryan.jeff...@gmail.com>
> wrote:
>
>> You should be able to project a new column that is your group column.
>> Then you can group on the projected column.
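>>
>> A rough sketch of that idea (the field1/field2 values and the "groupCol"
>> name are assumed from the question):
>>
>> import org.apache.spark.sql.functions.{col, when}
>>
>> df.withColumn("groupCol",
>>     when(col("field1") === "foo", col("field1"))
>>       .when(col("field2") === "bar", col("field2")))
>>   .groupBy("groupCol")   // rows matching neither branch get null here
>>   .count()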
>>
>>
>>
>>
>>
>> On Sat, Jun 3, 2017 at 6:26 PM -0400, "upendra 1991" <
>> upendra1...@yahoo.com.invalid> wrote:
>>
>>> Use a function that returns the grouping column.
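>>>
>>> For instance (a sketch; conditionalGroupCol is a made-up helper name):
>>>
>>> import org.apache.spark.sql.Column
>>> import org.apache.spark.sql.functions.{col, when}
>>>
>>> // Build the conditional grouping column once so the logic is reusable.
>>> def conditionalGroupCol(): Column =
>>>   when(col("field1") === "foo", col("field1"))
>>>     .when(col("field2") === "bar", col("field2"))
>>>
>>> df.groupBy(conditionalGroupCol()).count()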
>>>
>>>
>>> On Sat, Jun 3, 2017 at 5:01 PM, kant kodali
>>> <kanth...@gmail.com> wrote:
>>> Hi All,
>>>
>>> Is there a way to do a conditional group by in Spark 2.1.1? In other
>>> words, I want to do something like this:
>>>
>>> if (field1 == "foo") {
>>>   df.groupBy(field1)
>>> } else if (field2 == "bar") {
>>>   df.groupBy(field2)
>>> }
>>>
>>> Thanks
>>>
>>>
>
