Github user dragos commented on the pull request:
https://github.com/apache/spark/pull/10729#issuecomment-173225458
@nraychaudhuri indeed, it doesn't hang, but it doesn't stop the Spark shell
either. Since there is no Mesos driver, the application won't get any
resources and no job can make progress.
```
$ bin/spark-shell --master mesos://192.168.99.100:5050 --conf spark.mesos.role=aaa
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0-SNAPSHOT
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_45)
Type in expressions to have them evaluated.
Type :help for more information.
16/01/20 15:47:08 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
I0120 15:47:08.975942 65032192 sched.cpp:164] Version: 0.25.0
I0120 15:47:08.977934 59678720 sched.cpp:262] New master detected at [email protected]:5050
I0120 15:47:08.978126 59678720 sched.cpp:272] No credentials provided. Attempting to register without authentication
I0120 15:47:08.981176 61825024 sched.cpp:1024] Got error 'Role 'aaa' is not present in the master's --roles'
I0120 15:47:08.981202 61825024 sched.cpp:1805] Asked to abort the driver
E0120 15:47:08.981292 63971328 socket.hpp:174] Shutdown failed on fd=112: Socket is not connected [57]
16/01/20 15:47:08 ERROR CoarseMesosSchedulerBackend: Mesos error: Role 'aaa' is not present in the master's --roles
Exception in thread "Thread-14" org.apache.spark.SparkException: Exiting due to error from cluster scheduler: Role 'aaa' is not present in the master's --roles
	at org.apache.spark.scheduler.TaskSchedulerImpl.error(TaskSchedulerImpl.scala:438)
	at org.apache.spark.scheduler.cluster.mesos.CoarseMesosSchedulerBackend.error(CoarseMesosSchedulerBackend.scala:364)
I0120 15:47:08.982903 61825024 sched.cpp:1805] Asked to abort the driver
I0120 15:47:08.982928 61825024 sched.cpp:1070] Aborting framework ''
16/01/20 15:47:08 WARN CoarseMesosSchedulerBackend: Application ID is not initialized yet.
16/01/20 15:47:08 ERROR CoarseMesosSchedulerBackend: Error starting driver, DRIVER_ABORTED
16/01/20 15:47:09 INFO SparkILoop: Created spark context..
Spark context available as sc (master = mesos://192.168.99.100:5050, app id = spark-application-1453301228917).
16/01/20 15:47:09 INFO SparkILoop: Created sql context..
SQL context available as sqlContext.

scala>
```
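For context (not part of the original report): the registration error above comes from the Mesos master rejecting a framework whose role is not in its role whitelist. A sketch of how the master would need to be started for this run to succeed, assuming a local Mesos 0.25 install (the `--work_dir` path is illustrative):

```shell
# Hypothetical master invocation: include 'aaa' in the --roles whitelist so
# that frameworks registering with spark.mesos.role=aaa are accepted.
mesos-master --ip=192.168.99.100 --port=5050 \
  --work_dir=/tmp/mesos --roles=aaa

# The same spark-shell command from the log should then register
# instead of hitting "Role 'aaa' is not present in the master's --roles":
bin/spark-shell --master mesos://192.168.99.100:5050 \
  --conf spark.mesos.role=aaa
```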