Github user tgravescs commented on the pull request:
https://github.com/apache/spark/pull/5130#issuecomment-85015152
That case is basically not handled right now. We expect one of the first
things the program does is to create the SparkContext, which is why the AM
waits for the SparkContext to be initialized. Anything you do in your program
before that initialization relies on the fact that we only wait a certain
period for it, and if you never create it, we consider that a failure. This
seems like more of a thing a workflow manager should be doing, but if you want
to handle that case I suggest filing a separate jira.

The relevant check in the ApplicationMaster is:
    val sc = waitForSparkContextInitialized()

    // If there is no SparkContext at this point, just fail the app.
    if (sc == null) {
      finish(FinalApplicationStatus.FAILED,
        ApplicationMaster.EXIT_SC_NOT_INITED,
        "Timed out waiting for SparkContext.")
    }