Why don't you install sbt and run "sbt assembly" to build an assembly (fat) jar? You can then use that jar for your spark-submit jobs.
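A minimal sketch of such a setup (the plugin version and project name below are assumptions for illustration; the Spark/Scala versions and main class are taken from the manifest you quoted):

```scala
// project/assembly.sbt -- enables the sbt-assembly plugin
// (0.14.3 is an assumed version from the sbt 0.13 era)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
```

```scala
// build.sbt -- minimal sketch; adjust name and versions to your project
name := "myjar"
scalaVersion := "2.10.6"

// "provided" keeps Spark itself out of the assembly jar,
// since the cluster supplies it at runtime
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1" % "provided"

// Writes the Main-Class entry into the assembled jar's MANIFEST.MF,
// which is the attribute your exported jar was missing
mainClass in assembly := Some("org.abc.spark.streaming.Producer")
```

Running "sbt assembly" then produces a jar under target/scala-2.10/ that "java -jar" can run.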
In case there are additional dependencies, these can be passed to spark-submit via the --jars option (comma-separated jar paths).

On Wed, Jul 27, 2016 at 11:53 AM, <luohui20...@sina.com> wrote:
> Hi there:
> I exported a project into a jar like this: "right click my project -> choose
> export -> java -> jar file -> next -> choose "src/main/resouces" and
> "src/main/scala" -> click browse and choose a jar file export location ->
> choose overwrite it", and this jar is unable to run with "java -jar
> myjar.jar". It says "no main manifest attribute, in /opt/MyJar.jar". It
> seems the file MANIFEST.MF lost the main class info when I exported the jar.
> I opened the MANIFEST.MF in my exported jar, and found only the info below:
>
> Manifest-Version: 1.0
>
> However, the MANIFEST.MF in my project looks like this:
>
> Manifest-Version: 1.0
> Main-Class: org.abc.spark.streaming.Producer
> Class-Path: lib/spark-core_2.10-1.6.1.jar ...........
>
> My Producer class does not use Spark, so I use java -jar to run it.
>
> I also tried to export my project as a runnable jar, but my class cannot
> be run as a Java application; the run button cannot be clicked.
>
> So could anyone share a hand?
>
> --------------------------------
>
> Thanks & Best regards!
> San.Luo
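For reference, a spark-submit invocation with extra jars might look like the following (the master setting, jar names, and paths are placeholders; the class name is taken from the manifest quoted above):

```shell
# Submit the assembly jar; --jars takes a comma-separated list of extra
# dependency jars that are shipped alongside the application jar
# (all paths here are placeholders)
spark-submit \
  --class org.abc.spark.streaming.Producer \
  --master local[2] \
  --jars lib/extra-dep1.jar,lib/extra-dep2.jar \
  target/scala-2.10/myjar-assembly-1.0.jar
```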