Hi,

I was wondering if anyone has suggestions on how to use SparkOperator to
submit a PySpark file to a Spark cluster. Also, what is the recommended way
to specify the job's PySpark dependencies?
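
For concreteness, something along the lines of the sketch below is roughly
what we are after (a minimal sketch, assuming "SparkOperator" here means
Airflow's SparkSubmitOperator; the connection id, bucket, and file names
are placeholder assumptions, not our actual setup):

from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="pyspark_submit_example",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule=None,
) as dag:
    submit_job = SparkSubmitOperator(
        task_id="submit_pyspark_job",
        conn_id="spark_default",  # Airflow connection pointing at the Spark cluster
        application="s3a://my-bucket/jobs/etl_job.py",  # main PySpark file (placeholder path)
        py_files="s3a://my-bucket/deps/libs.zip",  # zipped PySpark dependencies (placeholder path)
    )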

We currently push the user's PySpark file and its dependencies to an S3
location, where they are picked up by our Spark cluster. We would like to
explore whether there are ways to improve this workflow.
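
For reference, our current upload step is essentially the following (a
minimal sketch using boto3; the bucket and key names are placeholders):

import boto3

s3 = boto3.client("s3")
# Push the user's main PySpark file and zipped dependencies to S3,
# where the Spark cluster picks them up.
s3.upload_file("etl_job.py", "my-bucket", "jobs/etl_job.py")
s3.upload_file("libs.zip", "my-bucket", "deps/libs.zip")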

Thanks,
-Tao
