When I run "sbt assembly", I use the "provided" configuration on the Spark
library dependency in build.sbt, to avoid conflicts in the fat jar:

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.8.1-incubating" % "provided"

But if I want to do "sbt run", I have to remove the "provided" scope; otherwise
it doesn't find the Spark classes, since "provided" dependencies are left off
the run classpath.

Is there a way to set up my build.sbt so that it does the right thing in
both cases, without monkeying with it each time?
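
For what it's worth, here is a sketch of the kind of thing I'm imagining
(untested, and the exact incantation is my guess, not something I've verified):
keep the dependency "provided" so assembly still excludes it, but re-wire the
run task to use the Compile classpath, where "provided" dependencies are still
visible:

// assumption: point "run" at the Compile classpath, which still contains
// "provided" dependencies, while "sbt assembly" keeps excluding them
run in Compile <<= Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
)

I don't know whether that is idiomatic, or whether there is a cleaner setting
I'm missing.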
