GitHub user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/22577#discussion_r221272009
--- Diff:
core/src/main/scala/org/apache/spark/status/api/v1/OneApplicationResource.scala
---
@@ -175,7 +175,7 @@ private[v1] class OneApplicationAttemptResource extends AbstractApplicationResou
def getAttempt(): ApplicationAttemptInfo = {
uiRoot.getApplicationInfo(appId)
.flatMap { app =>
- app.attempts.filter(_.attemptId == attemptId).headOption
--- End diff ---
I don't think it's a compile error, just a warning, and one that might be new in Scala 2.12. I tried to fix up a bunch of these in
https://github.com/apache/spark/commit/cfbdd6a1f5906b848c520d3365cc4034992215d9,
for example.
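For context, a minimal sketch of the pattern this diff touches (the `Attempt` case class here is a hypothetical stand-in, not Spark's `ApplicationAttemptInfo`): `filter(p).headOption` and `find(p)` return the same `Option`, but `find` stops at the first match instead of materializing an intermediate collection.

```scala
// Hypothetical stand-in for the attempt records in the diff.
case class Attempt(attemptId: Option[String])

object FilterVsFind {
  def main(args: Array[String]): Unit = {
    val attempts = Seq(Attempt(Some("1")), Attempt(Some("2")))
    val wanted: Option[String] = Some("2")

    // The pattern on the removed line: filters the whole Seq, then takes the head.
    val viaFilter = attempts.filter(_.attemptId == wanted).headOption

    // The equivalent single-pass form: short-circuits on the first match.
    val viaFind = attempts.find(_.attemptId == wanted)

    assert(viaFilter == viaFind)
    println(viaFind)
  }
}
```

The two forms are behaviorally equivalent for a `Seq`, so rewriting one as the other is a safe cleanup regardless of which compiler warning it silences.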
---