Hi,
I don't think there are any sbt-related changes in Spark 2.0. Just
different versions in libraryDependencies.
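For example, a minimal build.sbt for a Spark 2.0 job might look like
this (the project name and Scala version here are just placeholders;
Spark 2.0 is built against Scala 2.11):

    // build.sbt -- minimal sketch for a Spark 2.0 job
    name := "my-spark-job"
    scalaVersion := "2.11.8"
    libraryDependencies ++= Seq(
      // "provided" because spark-submit supplies Spark at runtime
      "org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.0.0" % "provided"
    )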
As to the article, I'm surprised it didn't mention sbt-assembly [1],
which builds a self-contained uber jar (a docker-like deployment
artifact), or sbt-native-packager [2], which can create an actual
Docker image. See the sketch after the links.
[1] https://github.com/sbt/sbt-assembly
[2] https://github.com/sbt/sbt-native-packager
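If it helps, here's a minimal sketch of wiring in both plugins (the
plugin versions are just examples; check each project's README for the
current ones):

    // project/plugins.sbt
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
    addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.1.4")

    // build.sbt -- enable Docker packaging via sbt-native-packager
    enablePlugins(JavaAppPackaging, DockerPlugin)

With that in place, sbt assembly builds a fat jar you can hand to
spark-submit, and sbt docker:publishLocal builds a local Docker image
of the app.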
Just wondering:
What is the correct way of building a Spark job using Scala? Are there
any changes coming with Spark v2?
I've been following this post:
http://www.infoobjects.com/spark-submit-with-sbt/
Then again, I've been mainly using Docker locally. What is a decent
container for submitting these?