I'm relatively new to Spark and have got a couple of questions:

* I've got an IntelliJ SBT project that uses Spark Streaming with a custom RabbitMQ receiver in the same project. When I run it against local[2], all's well. When I point it at spark://masterip:7077, I get a ClassNotFoundException for RmqReceiver (the name of the custom receiver). Note that this is being executed inside IntelliJ, and no jars are built in the target folder. I guess using spark-submit would work, but I was wondering if there's a way to simply run the app in IntelliJ and have it work.
* I see there's an sc.addJar(..) method that would (I imagine) submit additional jars. Is there a way to have it submit the "current project's" classes as well? Or would building the package and submitting it take care of this? (There's a rough sketch of what I mean below.)
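
For illustration, here's roughly what I imagine the driver would look like if I ran "sbt package" first and pointed the conf at the resulting jar. The jar path, app name, and the RmqReceiver constructor arguments are all placeholders for my project; I'm not sure this is the idiomatic approach:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object StreamingApp {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setMaster("spark://masterip:7077")
          .setAppName("rmq-streaming") // placeholder name
          // Ship the packaged project jar (which contains RmqReceiver)
          // to the executors; the path below is a placeholder for
          // whatever sbt actually writes to the target folder.
          .setJars(Seq("target/scala-2.10/myproject_2.10-0.1.jar"))

        val ssc = new StreamingContext(conf, Seconds(5))

        // Register the custom receiver (constructor args elided).
        val stream = ssc.receiverStream(new RmqReceiver(/* rabbit settings */))
        stream.print()

        ssc.start()
        ssc.awaitTermination()
      }
    }

If something like that is the way to go, it would at least keep the run inside IntelliJ, with the only extra step being a rebuild of the jar whenever the receiver's code changes.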

Any pointers are appreciated.

Regards,
Ashic.
