Hello everyone, here is a case that I am facing:

I have a PySpark application whose last step is to create a PySpark
DataFrame with two columns (column1, column2). This DataFrame has only one
row, and I want that row to be inserted into a Postgres DB table. On every
run, the row in the DataFrame may be different or the same.
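
For context, the DataFrame is built roughly like this; a minimal sketch,
where the Spark session setup and the row values are only illustrative
assumptions:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('test_write').getOrCreate()

# Single-row, two-column DataFrame; the values may change between runs
test_write_df = spark.createDataFrame(
    [('value_a', 'value_b')],          # hypothetical row values
    ['column1', 'column2'])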

After 10 runs I want to have 10 rows in my Postgres table, so I want to
insert that DataFrame into the Postgres table on every run. What I have
done up to now is to use the code below, but it doesn't insert a new row
after every run of my PySpark application; it just overwrites the old row.

Here is my code:

# Append the single-row DataFrame to the Postgres table over JDBC
test_write_df.write.format('jdbc').mode('append') \
    .options(url='jdbc:postgresql://localhost:5432/Test_Db',
             dbtable='test_write_df', driver='org.postgresql.Driver',
             user='postgres', password='my_password') \
    .save()
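
To check the result after each run, I read the table back like this (a
minimal sketch reusing the same assumed connection details), expecting the
count to grow by one per run:

# Read the table back to verify how many rows it holds
check_df = spark.read.format('jdbc') \
    .options(url='jdbc:postgresql://localhost:5432/Test_Db',
             dbtable='test_write_df', driver='org.postgresql.Driver',
             user='postgres', password='my_password') \
    .load()

print(check_df.count())  # expected to increase by one after every run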

Can you please advise me whether this is feasible and/or how to achieve it?

Thank you in advance
