You could create a new column based on a conditional expression: IF(condition1,
value1, old_column_value). In the DataFrame API that corresponds to when/otherwise.

On Mon, Nov 23, 2015 at 11:57 AM, Vishnu Viswanath
<vishnu.viswanat...@gmail.com> wrote:
> Thanks for the reply Davies
>
> I think replace replaces one value with another value, but what I want to do
> is fill in the null values of a column (I don't have a to_replace here).
>
> Regards,
> Vishnu
>
> On Mon, Nov 23, 2015 at 1:37 PM, Davies Liu <dav...@databricks.com> wrote:
>>
>> DataFrame.replace(to_replace, value, subset=None)
>>
>>
>> http://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame.replace
>>
>> On Mon, Nov 23, 2015 at 11:05 AM, Vishnu Viswanath
>> <vishnu.viswanat...@gmail.com> wrote:
>> > Hi
>> >
>> > Can someone tell me if there is a way I can use the fill method in
>> > DataFrameNaFunctions based on some condition.
>> >
>> > e.g., df.na.fill("value1","column1","condition1")
>> >         df.na.fill("value2","column1","condition2")
>> >
>> > I want to fill nulls in column1 with either value1 or value2, based on
>> > some condition.
>> >
>> > Thanks,
>
>
>
>
> --
> Thanks and Regards,
> Vishnu Viswanath
> +1 309 550 2311
> www.vishnuviswanath.com

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
