Re: Exiting driver main() method...

2015-05-04 Thread James Carman
I think I figured it out.  I am playing around with the Cassandra connector
and I had a method that inserted some data into a locally-running Cassandra
instance, but I forgot to close the Cluster object.  I guess that left some
non-daemon thread running, which kept the process from exiting.  Nothing to see
here, move along.  :)
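
For anyone who hits the same symptom, closing the Cluster is what lets the
JVM shut down.  A minimal sketch of the fix (this assumes the DataStax Java
driver 2.x, where Cluster is Closeable; the keyspace and table names are
made up):

import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Session;

public class CassandraInsert {

    public static void main(String[] args) {
        // try-with-resources closes the Cluster on the way out, shutting
        // down the driver's non-daemon I/O threads so the JVM can exit.
        try (Cluster cluster = Cluster.builder()
                .addContactPoint("127.0.0.1")
                .build()) {
            Session session = cluster.connect("demo_ks");
            session.execute(
                "INSERT INTO demo_table (id, value) VALUES (1, 'one')");
        }
    }
}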


On Sat, May 2, 2015 at 2:44 PM Mohammed Guller moham...@glassbeam.com
wrote:

  No, you don’t need to do anything special. Perhaps your application is
 getting stuck somewhere? If you can share your code, someone may be able to
 help.



 Mohammed



 *From:* James Carman [mailto:ja...@carmanconsulting.com]
 *Sent:* Friday, May 1, 2015 5:53 AM
 *To:* user@spark.apache.org
 *Subject:* Exiting driver main() method...



 In all the examples, it seems that the Spark application doesn't really do
 anything special in order to exit.  When I run my application, however, the
 spark-submit script just hangs there at the end.  Is there something
 special I need to do to get that thing to exit normally?



Troubling Logging w/Simple Example (spark-1.2.2-bin-hadoop2.4)...

2015-05-04 Thread James Carman
I have the following simple example program:

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SimpleCount {

    public static void main(String[] args) {
        // Master comes from -Dspark.master (or spark-submit), defaulting
        // to local mode.
        final String master = System.getProperty("spark.master", "local[*]");
        System.out.printf("Running job against spark master %s ...%n", master);

        final SparkConf conf = new SparkConf()
                .setAppName("simple-count")
                .setMaster(master)
                .set("spark.eventLog.enabled", "true");
        final JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<Integer> rdd =
                sc.parallelize(Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8, 9, 10));

        long n = rdd.count();

        System.out.printf("I counted %d integers.%n", n);
    }
}
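
As an aside, since the master is read from the spark.master system property
with a local[*] default, the same jar can be sanity-checked without the
standalone master by submitting in local mode, e.g. something like:

bin/spark-submit --class com.cengage.analytics.SimpleCount --master "local[*]" \
  ~/IdeaProjects/spark-analytics/target/spark-analytics-1.0-SNAPSHOT.jar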

I start a local master:

export SPARK_MASTER_IP=localhost

sbin/start-master.sh


Then, I start a local worker:


bin/spark-class org.apache.spark.deploy.worker.Worker -h localhost spark://localhost:7077



When I run the example application:


bin/spark-submit --class com.cengage.analytics.SimpleCount --master spark://localhost:7077 \
  ~/IdeaProjects/spark-analytics/target/spark-analytics-1.0-SNAPSHOT.jar


It finishes just fine (and even counts the right number :).  However, I get
the following log statements in the master's log file:


15/05/04 09:54:14 INFO Master: Registering app simple-count

15/05/04 09:54:14 INFO Master: Registered app simple-count with ID
app-20150504095414-0009

15/05/04 09:54:14 INFO Master: Launching executor app-20150504095414-0009/0
on worker worker-20150504095401-localhost-55806

15/05/04 09:54:17 INFO Master: akka.tcp://sparkDriver@jamess-mbp:55939 got
disassociated, removing it.

15/05/04 09:54:17 INFO Master: Removing app app-20150504095414-0009

15/05/04 09:54:17 WARN ReliableDeliverySupervisor: Association with remote
system [akka.tcp://sparkDriver@jamess-mbp:55939] has failed, address is now
gated for [5000] ms. Reason is: [Disassociated].

15/05/04 09:54:17 INFO LocalActorRef: Message
[akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from
Actor[akka://sparkMaster/deadLetters] to
Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%40127.0.0.1%3A55948-17#800019242]
was not delivered. [18] dead letters encountered. This logging can be
turned off or adjusted with configuration settings 'akka.log-dead-letters'
and 'akka.log-dead-letters-during-shutdown'.

15/05/04 09:54:17 INFO SecurityManager: Changing view acls to: jcarman

15/05/04 09:54:17 INFO SecurityManager: Changing modify acls to: jcarman

15/05/04 09:54:17 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users with view permissions: Set(jcarman);
users with modify permissions: Set(jcarman)

15/05/04 09:54:17 INFO Master: akka.tcp://sparkDriver@jamess-mbp:55939 got
disassociated, removing it.

15/05/04 09:54:17 WARN EndpointWriter: AssociationError
[akka.tcp://sparkMaster@localhost:7077] ->
[akka.tcp://sparkWorker@localhost:51252]:
Error [Invalid address: akka.tcp://sparkWorker@localhost:51252] [

akka.remote.InvalidAssociation: Invalid address:
akka.tcp://sparkWorker@localhost:51252

Caused by: akka.remote.transport.Transport$InvalidAssociationException:
Connection refused: localhost/127.0.0.1:51252

]

15/05/04 09:54:17 WARN Remoting: Tried to associate with unreachable remote
address [akka.tcp://sparkWorker@localhost:51252]. Address is now gated for
5000 ms, all messages to this address will be delivered to dead letters.
Reason: Connection refused: localhost/127.0.0.1:51252

15/05/04 09:54:17 INFO Master: akka.tcp://sparkWorker@localhost:51252 got
disassociated, removing it.

15/05/04 09:54:17 WARN EndpointWriter: AssociationError
[akka.tcp://sparkMaster@localhost:7077] ->
[akka.tcp://sparkWorker@jamess-mbp:50071]: Error [Invalid address:
akka.tcp://sparkWorker@jamess-mbp:50071] [

akka.remote.InvalidAssociation: Invalid address:
akka.tcp://sparkWorker@jamess-mbp:50071

Caused by: akka.remote.transport.Transport$InvalidAssociationException:
Connection refused: jamess-mbp/192.168.1.45:50071

]

15/05/04 09:54:17 WARN Remoting: Tried to associate with unreachable remote
address [akka.tcp://sparkWorker@jamess-mbp:50071]. Address is now gated for
5000 ms, all messages to this address will be delivered to dead letters.
Reason: Connection refused: jamess-mbp/192.168.1.45:50071

15/05/04 09:54:17 INFO Master: akka.tcp://sparkWorker@jamess-mbp:50071 got
disassociated, removing it.

15/05/04 09:54:17 INFO RemoteActorRefProvider$RemoteDeadLetterActorRef:
Message [org.apache.spark.deploy.DeployMessages$ApplicationFinished] from
Actor[akka://sparkMaster/user/Master#-1247271270] to
Actor[akka://sparkMaster/deadLetters] was not delivered. [19] dead letters
encountered. This logging can be turned off or adjusted with configuration
settings 'akka.log-dead-letters' and
'akka.log-dead-letters-during-shutdown'.

15/05/04 09:54:17 INFO RemoteActorRefProvider$RemoteDeadLetterActorRef:
Message 

Exiting driver main() method...

2015-05-01 Thread James Carman
In all the examples, it seems that the Spark application doesn't really do
anything special in order to exit.  When I run my application, however, the
spark-submit script just hangs there at the end.  Is there something
special I need to do to get that thing to exit normally?