Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14659#discussion_r79419642
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/Task.scala ---
    @@ -54,7 +54,10 @@ private[spark] abstract class Task[T](
         val partitionId: Int,
         // The default value is only used in tests.
         val metrics: TaskMetrics = TaskMetrics.registered,
    -    @transient var localProperties: Properties = new Properties) extends Serializable {
    +    @transient var localProperties: Properties = new Properties,
    +    val jobId: Option[Int] = None,
    --- End diff --
    
    Are these params all optional just to make it easier for the different task
    types? I think jobId and appId are effectively mandatory now; only the app
    attempt id is still genuinely optional. I'm leaning towards making these not
    be Option, so that if someone adds a new Task type we make sure they are set
    up properly and thus the context is set properly.
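    
    For illustration, a rough sketch of the shape I mean — the types here
    (Int for jobId, String for appId and appAttemptId) are my assumption,
    not necessarily what the PR ends up with:
    
        import java.util.Properties
        import org.apache.spark.executor.TaskMetrics
        
        // Sketch only, showing just the tail of the ctor from the diff above.
        private[spark] abstract class Task[T](
            val partitionId: Int,
            // The default value is only used in tests.
            val metrics: TaskMetrics = TaskMetrics.registered,
            @transient var localProperties: Properties = new Properties,
            val jobId: Int,                          // now mandatory
            val appId: String,                       // now mandatory
            val appAttemptId: Option[String] = None) // still genuinely optional
          extends Serializable {
          // ... body unchanged ...
        }
    
    That way any new Task subclass fails to compile until it supplies a jobId
    and appId, which is exactly the forcing function I'm after.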

