In general, building and running Spark inside IntelliJ IDEA is a pain. I guess most people take that approach so they can use the integrated debugger to step through and learn Spark internals. A more convenient way I've been using recently is the remote debugging feature: by adding driver/executor Java options, you can build and start Spark applications/tests/daemons in the normal way and then attach the debugger to them. I used this to debug HiveThriftServer2, and it worked perfectly.

Steps to enable remote debugging:

1. Menu "Run / Edit configurations..."
2. Click the "+" button, choose "Remote"
3. Choose "Attach" or "Listen" in "Debugger mode" according to your actual needs
4. Copy the Java options suggested in the dialog, edit them if necessary, and add them to `--driver-java-options` (or, for executors, to the `spark.executor.extraJavaOptions` configuration), as in the example below
5. If you're using attach mode, first start your Spark program, then start remote debugging in IDEA
6. If you're using listen mode, first start remote debugging in IDEA, then start your Spark program
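
For example, to debug the driver in attach mode, something like this works (a minimal sketch; port 5005, the example class, and the jar path are placeholders to adapt):

  ./bin/spark-submit \
    --driver-java-options "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005" \
    --class org.apache.spark.examples.SparkPi \
    examples/target/scala-2.10/spark-examples-*.jar

Here `suspend=y` makes the driver JVM pause until the debugger attaches, so breakpoints hit during startup aren't missed; use `suspend=n` if you only care about later code paths. For executors, pass the same agent string via `--conf "spark.executor.extraJavaOptions=..."` instead, and make sure each executor on a host gets its own port.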

Hope this helps.

Cheng

On 4/4/15 12:54 AM, sara mustafa wrote:
Thank you, it works for me once I changed the dependencies from provided to compile.
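
In sbt terms, that means dropping the "provided" qualifier; a minimal sketch, assuming spark-core 1.3.0:

  // "provided" scope is excluded from the runtime classpath, so running from IDEA fails
  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"

  // default (compile) scope is on the runtime classpath, so IDEA can run it directly
  libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"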




---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org
