Hi Stelios,

Thank you so much for your help.

If I use lit(), it gives a "Column is not iterable" error.

Can you suggest a simple way of achieving my use case? I need to send the
entire column, record by record, to the API in JSON format.
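[Editorial sketch: the "Column is not iterable" error usually means a plain
Python function received a pyspark Column object rather than row values; a
column can only be consumed value by value inside a UDF or after collecting
rows on the driver. Below is a minimal plain-Python sketch of the
record-by-record idea, with `call_to_cust_bulk_api`, `policy_url`, and the
row contents as hypothetical stand-ins for Sid's actual helper and data.]

```python
import json

# Stand-in for the real API helper; here it just records what it receives.
sent_payloads = []

def call_to_cust_bulk_api(policy_url, json_record):
    """Hypothetical helper: POST one JSON record to policy_url."""
    sent_payloads.append((policy_url, json_record))

# Rows as they might come back from finalDF.collect(); each row becomes one
# JSON record, mirroring to_json(struct(*colsListToBePassed)).
rows = [
    {"policyNo": "P001", "status_for_batch": "OK"},
    {"policyNo": "P002", "status_for_batch": "FAILED"},
]

policy_url = "https://example.com/bulk"  # placeholder endpoint

for row in rows:
    call_to_cust_bulk_api(policy_url, json.dumps(row))
```

On a real DataFrame the same loop shape would normally live inside a UDF or
`foreachPartition`, so the rows are never pulled back to the driver.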


TIA,
Sid

On Fri, Jun 10, 2022 at 2:51 PM Stelios Philippou <stevo...@gmail.com>
wrote:

> Sid
> Then the issue is with the data, i.e. with the way you are creating it for
> that specific column.
>
> call_to_cust_bulk_api(policyUrl,to_json(struct(*colsListToBePassed)))
>
> Perhaps wrap that in a
>
> lit(call_to_cust_bulk_api(policyUrl,to_json(struct(*colsListToBePassed))))
>
> Otherwise, you will need to start sending simpler data there to make sure
> that the API works.
>
>
> On Fri, 10 Jun 2022 at 12:15, Sid <flinkbyhe...@gmail.com> wrote:
>
>> Still,  it is giving the same error.
>>
>> On Fri, Jun 10, 2022 at 5:13 AM Sean Owen <sro...@gmail.com> wrote:
>>
>>> That repartition seems to do nothing? But yes, the key point is to use col().
>>>
>>> On Thu, Jun 9, 2022, 9:41 PM Stelios Philippou <stevo...@gmail.com>
>>> wrote:
>>>
>>>> Perhaps
>>>>
>>>>
>>>> finalDF.repartition(finalDF.rdd.getNumPartitions()).withColumn("status_for_batch
>>>>
>>>> To
>>>>
>>>> finalDF.repartition(finalDF.rdd.getNumPartitions()).withColumn(col("status_for_batch")
>>>>
>>>>
>>>>
>>>>
>>>> On Thu, 9 Jun 2022, 22:32 Sid, <flinkbyhe...@gmail.com> wrote:
>>>>
>>>>> Hi Experts,
>>>>>
>>>>> I am facing one problem while passing a column to the method.  The
>>>>> problem is described in detail here:
>>>>>
>>>>>
>>>>> https://stackoverflow.com/questions/72565095/how-to-pass-columns-as-a-json-record-to-the-api-method-using-pyspark
>>>>>
>>>>> TIA,
>>>>> Sid
>>>>>
>>>>