Hi,
I have created a custom Estimator in Scala, which I can use successfully by
creating a pipeline model in Java and Scala. But when I try to load the
pipeline model saved with the Scala API in PySpark, I get a "module not
found" error.
I have included my custom model jar in the classpath.
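The jar being on the classpath is usually not enough here: when PySpark deserializes a saved pipeline, it maps each JVM stage class to a Python class by name, and only the `org.apache.spark` prefix is rewritten to `pyspark`. A custom Scala class keeps its own package name, so Python tries to import a module with that exact path and fails. The sketch below (simplified from PySpark's `JavaParams._from_java` behavior; the class name `com.example.MyEstimatorModel` is a hypothetical placeholder) shows why the error appears and what Python module layout would resolve it:

```python
# Simplified sketch of how PySpark derives a Python class name from a
# JVM stage class when loading a saved PipelineModel.
def python_class_for(java_class_name: str) -> str:
    # Only the org.apache.spark prefix is rewritten to pyspark; any
    # other package name is imported verbatim as a Python module path.
    return java_class_name.replace("org.apache.spark", "pyspark")

# A built-in stage resolves to a class that ships with PySpark:
print(python_class_for("org.apache.spark.ml.feature.StringIndexerModel"))

# A custom Scala stage keeps its JVM package, so Python attempts
# "import com.example" and raises ModuleNotFoundError unless a Python
# package/module mirroring that path exists and defines a wrapper class
# with the same name as the Scala class:
print(python_class_for("com.example.MyEstimatorModel"))
```

So the usual fix is to ship a small Python package whose module path mirrors the Scala package (e.g. `com/example/MyEstimatorModel.py` on the `PYTHONPATH`) containing a `JavaModel`/`JavaMLReadable` wrapper class, in addition to putting the jar on the driver and executor classpaths.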

Hi,
I am not using Airflow, but I assume your application is deployed in
cluster mode; in that case the class you are looking for is
*org.apache.spark.deploy.k8s.submit.Client* [1].
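For context, that class is what spark-submit invokes internally when you target Kubernetes in cluster mode. A minimal invocation sketch is below; the API-server URL, image name, main class, and jar path are hypothetical placeholders:

```shell
# With a k8s:// master and --deploy-mode cluster, spark-submit hands off
# to org.apache.spark.deploy.k8s.submit.Client, which creates the driver
# pod in the cluster. All values below are placeholders.
spark-submit \
  --master k8s://https://kubernetes.example.com:6443 \
  --deploy-mode cluster \
  --name my-app \
  --class com.example.Main \
  --conf spark.kubernetes.container.image=my-registry/spark:latest \
  local:///opt/spark/jars/my-app.jar
```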
If we are talking about the first "spark-submit" used to start the
application and not "spark-submit --status" th