I think I figured it out. I was playing around with the Cassandra connector,
and I had a method that inserted some data into a locally running Cassandra
instance, but I forgot to close the Cluster object. I guess that left some
non-daemon thread running and kept the process from exiting. Nothing to do
with Spark itself, then.
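For reference, the fix is simply to close the Cluster once you are done with
it. A minimal sketch of the pattern (the contact point, keyspace, and table
names below are placeholders, not my actual code):

    import com.datastax.driver.core.Cluster

    val cluster = Cluster.builder()
      .addContactPoint("127.0.0.1") // the locally running Cassandra instance
      .build()
    try {
      val session = cluster.connect("my_keyspace") // keyspace name assumed
      session.execute("INSERT INTO my_table (id, value) VALUES (1, 'x')") // table assumed
    } finally {
      // Without this, the driver's non-daemon I/O threads stay alive and
      // keep the JVM from exiting after the job finishes.
      cluster.close()
    }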
It used to exit without any problem for me. You can also check the driver UI
(which runs on port 4040 by default, i.e. http://localhost:4040 for a local
run) and see what exactly it's doing.
Thanks
Best Regards
On Fri, May 1, 2015 at 6:22 PM, James Carman wrote:
> In all the examples, it seems that the Spark application doesn't really do
> anything
No, you don’t need to do anything special. Perhaps your application is getting
stuck somewhere? If you can share your code, someone may be able to help.
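In the meantime, a thread dump (e.g. running jstack against the driver PID)
will show which non-daemon threads are keeping the JVM alive. You can also
print them from the application itself; a rough sketch (nothing here assumes
anything about your code):

    import scala.collection.JavaConverters._

    // List every live non-daemon thread; these are what block JVM exit.
    for (t <- Thread.getAllStackTraces.keySet.asScala if !t.isDaemon)
      println(s"non-daemon thread still running: ${t.getName}")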
Mohammed
From: James Carman [mailto:ja...@carmanconsulting.com]
Sent: Friday, May 1, 2015 5:53 AM
To: user@spark.apache.org
Subject: Exiting "