Hello,
Is there any way to get the attempt number in a closure? Seems
TaskContext.attemptId actually returns the taskId of a task (see this
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/executor/Executor.scala#L181
and this
https://github.com/apache/spark/blob
I think part of the problem is that we don't actually have the attempt id
on the executors. If we do, that's great. If not, we'd need to propagate
that over.
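To make the confusion concrete, here is a small self-contained sketch (hypothetical names, not Spark's actual scheduler code) of why a globally unique task id cannot stand in for the per-task attempt number: the id keeps growing across all tasks, while the attempt number restarts at 0 for each task and counts only its retries.

```scala
// Hypothetical simulation of the two ids being conflated in this thread:
// the scheduler hands out a globally unique "task attempt id" per launched
// task, while the per-task retry counter ("attempt number") starts at 0
// for every task and increments only when that task is retried.
object AttemptIdDemo {
  private var nextTaskAttemptId = 0L

  case class LaunchedTask(partition: Int, attemptNumber: Int, taskAttemptId: Long)

  def launch(partition: Int, priorFailures: Int): LaunchedTask = {
    val t = LaunchedTask(partition, priorFailures, nextTaskAttemptId)
    nextTaskAttemptId += 1
    t
  }

  def main(args: Array[String]): Unit = {
    val first = launch(partition = 0, priorFailures = 0)
    val retry = launch(partition = 0, priorFailures = 1)
    val other = launch(partition = 1, priorFailures = 0)
    // The retry of partition 0 is attempt number 1 for that task...
    assert(first.attemptNumber == 0 && retry.attemptNumber == 1)
    // ...but its globally unique id is just "second task ever launched",
    // which tells a closure nothing about whether it is a retry.
    assert(retry.taskAttemptId == 1L && other.taskAttemptId == 2L)
    println("ok")
  }
}
```

A closure that only sees the unique id therefore cannot do retry-aware work (e.g. "skip the side effect on a second attempt") without the attempt number being shipped to the executors separately.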
On Mon, Oct 20, 2014 at 7:17 AM, Yin Huai huaiyin@gmail.com wrote:
Hello,
Is there any way to get the attempt number in a closure? Seems
TaskContext.attemptId actually returns the taskId of a task (see this
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/executor/Executor.scala#L181
and this
https://github.com/apache/spark/blob/master