Github user devaraj-kavali commented on the pull request:
https://github.com/apache/spark/pull/10129#issuecomment-162022777
Thanks @tgravescs for the details. I missed it before creating the PR.
I am thinking of the following approach to support both <2.4 and >=2.4
Apache Hadoop versions:
1. For supporting <2.4 versions, we can keep an implementation in
ApplicationMaster that handles the AM_RESYNC and AM_SHUTDOWN commands. This
handling can be removed once these deprecated commands are dropped from
later versions of Apache Hadoop.
2. For supporting >=2.4 versions (i.e. to avoid unnecessary retries when the
RM throws ApplicationAttemptNotFoundException), we can check the exception by
class name, e.g. "ApplicationAttemptNotFoundException" ==
e.getClass.getSimpleName, without referring to the
ApplicationAttemptNotFoundException class directly in ApplicationMaster.scala.
(Note that getName returns the fully qualified name, so the comparison should
use getSimpleName, or else the fully qualified string.)
case e: Throwable =>
  // Match by simple class name to avoid a compile-time dependency
  // on the Hadoop >=2.4-only exception class.
  if ("ApplicationAttemptNotFoundException" == e.getClass.getSimpleName) {
    val message = "ApplicationAttemptNotFoundException was thrown from Reporter thread."
    logError(message, e)
    finish(FinalApplicationStatus.FAILED,
      ApplicationMaster.EXIT_REPORTER_FAILURE,
      message)
  }
  failureCount += 1
  if (!NonFatal(e) || failureCount >= reporterMaxFailures) {
    .......
  }
And this code can be changed to refer to the ApplicationAttemptNotFoundException
class directly once we withdraw support for <2.4 Hadoop versions.
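As a minimal, self-contained sketch of the name-based matching idea (using a locally defined stand-in class in place of the real Hadoop exception, precisely because the point is to avoid referencing it at compile time):

```scala
object ClassNameMatchDemo {
  // Stand-in for org.apache.hadoop.yarn.exceptions.ApplicationAttemptNotFoundException;
  // in real code this class would come from the Hadoop >=2.4 jars at runtime
  // and would deliberately not be referenced at compile time.
  class ApplicationAttemptNotFoundException(msg: String) extends RuntimeException(msg)

  // True when the throwable's class has the target simple name.
  // getSimpleName is used so the package of the class does not matter.
  def isAttemptNotFound(t: Throwable): Boolean =
    "ApplicationAttemptNotFoundException" == t.getClass.getSimpleName

  def main(args: Array[String]): Unit = {
    println(isAttemptNotFound(new ApplicationAttemptNotFoundException("attempt gone")))
    println(isAttemptNotFound(new RuntimeException("something else")))
  }
}
```

Matching on getSimpleName (rather than an exact fully qualified name) is a judgment call; comparing against the full "org.apache.hadoop.yarn.exceptions.ApplicationAttemptNotFoundException" string would be stricter and avoid accidental matches on same-named classes from other packages.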
Please provide your suggestions.