Hi Kevin,

I had a similar use case (see the code below), though mine wasn't
Spark-related. I think the code below should work for you; you may need to
adapt the context variable to suit your needs, but hopefully it gives the
general idea of sharing a single object between multiple threads.

Thanks


from threading import Thread
from pyspark.sql import SparkSession

# my_func stands in for whatever work each thread does with the shared session
def my_func(context):
    context.sql("SELECT 1").show()

# A single SparkSession shared by both threads
context = SparkSession.builder.appName("spark").getOrCreate()

# target and args belong in the Thread constructor; start() takes no arguments
t1 = Thread(target=my_func, args=(context,))
t2 = Thread(target=my_func, args=(context,))
t1.start()
t2.start()
t1.join()
t2.join()
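
If each thread also needs its own parameters on top of the shared session
(in my original code that was an app_id and a sleep_time), you can just
append them to the args tuple. A rough sketch, where order_creator, app_id
and sleep_time stand in for your own function and values:

# order_creator takes the shared session plus per-thread values
def order_creator(context, app_id, sleep_time):
    ...  # your per-thread work here

t1 = Thread(target=order_creator, args=(context, app_id, sleep_time))
t1.start()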
