Thanks for posting the solution! You can also append `% "provided"` to
the `spark-mllib` dependency line and remove `spark-core` (because
spark-mllib already depends on spark-core) to make the assembly jar
smaller. -Xiangrui
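
For readers following along, the advice above might look like the sketch below in a build.sbt. This is a hypothetical fragment, not from the thread: the project name, Scala version, and version numbers (Spark 1.0.2, scopt 3.2.0, matching the era of this thread) are illustrative assumptions.

```scala
// Hypothetical build.sbt sketch (name and version numbers are assumptions).
// Marking spark-mllib as "provided" keeps Spark's classes out of the assembly
// jar, since the cluster supplies them at runtime; spark-core is pulled in
// transitively by spark-mllib, so it need not be listed separately.
name := "my-spark-app"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-mllib" % "1.0.2" % "provided",
  "com.github.scopt" %% "scopt"       % "3.2.0"
)
```

With sbt-assembly on the classpath (see plugins.sbt below), `sbt assembly` then produces a jar containing only the application code and scopt, not Spark itself.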

On Fri, Aug 8, 2014 at 10:05 AM, SK <skrishna...@gmail.com> wrote:
> I was using "sbt package" when I got this error. Then I switched to using
> "sbt assembly" and that solved the issue. To run "sbt assembly", you need to
> have a file called plugins.sbt in the "<project root>/project" directory,
> and it has the following line:
>
> addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
>
> This is in addition to the <project name>.sbt file I mentioned in the
> earlier mail.
>
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/scopt-OptionParser-tp8436p11800.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
