I am trying to debug a Spark application running in a clustered/distributed
environment from Eclipse, but have not been able to succeed. The application
is Java-based and I run it through Eclipse; the Spark master/worker
configuration is provided through Java code only.

I can debug the code on the driver side, but as the flow moves into Spark
(i.e. a call to .map(...)), the debugger no longer stops, because that code
is running in the workers' JVMs.

Is there any way I can achieve this?

I have tried passing the following JVM options to Tomcat through Eclipse:
-Xdebug -Xrunjdwp:server=y,transport=dt_socket,address=7761,suspend=n

and setting the corresponding port in Debug -> Remote Java Application.

But with these settings I get the error: Failed to connect to remote VM.
Connection refused.

Note: I have tried this on both Windows and Linux (CentOS).
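From what I have read (not yet verified in my setup), those JDWP options
need to reach the executor JVMs themselves, e.g. via the
spark.executor.extraJavaOptions configuration property, rather than Tomcat,
which only hosts the driver. A small self-contained helper for building that
agent string (the property name in the comment is from the Spark docs; the
port 7761 is just the one I used):

```java
public class JdwpOptions {

    // Builds a JDWP agent argument of the kind I would pass to the
    // executors, e.g.:
    //   conf.set("spark.executor.extraJavaOptions", jdwpAgent(7761, true));
    // With suspend=y each executor JVM waits for a debugger to attach
    // before running any tasks.
    static String jdwpAgent(int port, boolean suspend) {
        return "-agentlib:jdwp=transport=dt_socket,server=y,suspend="
                + (suspend ? "y" : "n") + ",address=" + port;
    }

    public static void main(String[] args) {
        System.out.println(jdwpAgent(7761, true));
    }
}
```

Note that with multiple executors per host, a fixed port would collide, so
presumably this only works cleanly with one executor per machine (or one
executor total while debugging).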

If anybody has any solution to this, please help.



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Debugging-Apache-Spark-clustered-application-from-Eclipse-tp23483.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
