Hello, I have spent a lot of time trying to find out what I did wrong, but have not found it.
I have a Windows-based minikube cluster (Hyper-V as the hypervisor) and am trying
to run the examples against Spark 2.3. I have tried several Docker image builds:
* several images that I built myself
* andrusha/spark-k8s:2.3.0-hadoop2.7 from docker hub
But when I try to submit a job, the driver log shows a ClassNotFoundException:
spark-submit --master k8s://https://ip:8443 --deploy-mode cluster --name
spark-pi --class org.apache.spark.examples.SparkPi --conf
spark.executor.instances=1 --executor-memory 1G --conf spark.kubernete
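For reference, here is the general shape of a complete submission command for Spark 2.3 on Kubernetes; the IP, image tag, and jar path below are placeholders, not my exact values:

```shell
# Sketch of a full spark-submit for Spark 2.3 on Kubernetes.
# <ip>, the container image, and the jar path are assumptions to adapt.
bin/spark-submit \
  --master k8s://https://<ip>:8443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=1 \
  --executor-memory 1G \
  --conf spark.kubernetes.container.image=andrusha/spark-k8s:2.3.0-hadoop2.7 \
  local:///opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar
```

Note that in Spark 2.3's Kubernetes mode the application jar has to be reachable from inside the container, which is why the `local://` scheme points at a path baked into the image.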
I tried the https://github.com/apache-spark-on-k8s/spark fork and it
works without problems; more complex examples work as well.