Hi,
we have Spark jobs written entirely in Python, similar to the repo
https://github.com/AlexIoannides/pyspark-example-project.
We use spark-submit to run the application in local mode, but we want to
send metrics when the job ends (including on SIGTERM). To do that we need
something similar to a shutdown hook in Java/Scala, but I couldn't find a
shutdown hook interface in PySpark. Can somebody advise how to do this?
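In case it helps frame the question: since the driver is a plain Python process, one workaround I've been considering is the standard-library `atexit` and `signal` modules rather than anything Spark-specific. A minimal sketch (the `send_metrics` function here is hypothetical, standing in for whatever metrics call you'd actually make):

```python
import atexit
import signal
import sys

def send_metrics():
    # Hypothetical placeholder: replace with your real metrics emission.
    print("sending metrics...")

# atexit handlers run on normal interpreter exit (job completion, sys.exit).
atexit.register(send_metrics)

def handle_sigterm(signum, frame):
    # By default SIGTERM kills the process without running atexit handlers.
    # Raising SystemExit (via sys.exit) triggers a normal shutdown, so the
    # registered handlers still fire. 128 + signum is the conventional
    # exit code for death-by-signal.
    sys.exit(128 + signum)

signal.signal(signal.SIGTERM, handle_sigterm)
```

This covers normal completion and SIGTERM, but not SIGKILL, which cannot be trapped.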
-- 
Regards,
Shriraj Bhardwaj


*You have a right to your actions alone, never to their fruits; let not the
fruits of action be your motive, nor let your attachment be to inaction.*
