Hi,

The TaskContext isn't currently exposed in PySpark, but I've been meaning to
look at exposing at least some of it for parity with the Scala API. Is there
a particular use case you want this for? It would help with crafting the
JIRA :)
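
In the meantime, if all you need is a stable per-task identifier, the
partition index is usually a good enough stand-in: mapPartitionsWithIndex
hands your function the index of the partition it's processing on the
executor. A rough sketch (untested, names are just illustrative, and note
the index identifies the partition, not the task attempt):

    from pyspark import SparkContext

    sc = SparkContext(appName="partition-index-example")
    rdd = sc.parallelize(list(range(100)), 4)

    def tag_with_partition(index, iterator):
        # index is the partition index for this split; within a stage it
        # roughly corresponds to the task working on that partition
        for value in iterator:
            yield (index, value)

    print(rdd.mapPartitionsWithIndex(tag_with_partition).take(5))

It won't give you attempt numbers or stage ids the way the Scala
TaskContext does, but for logging or bucketing output per task it tends to
be enough.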

Cheers,

Holden :)

On Thu, Nov 24, 2016 at 1:39 AM, ofer <ofer.elias...@gmail.com> wrote:

> Hi,
> Is there a way in PySpark to get something like TaskContext from code
> running on an executor, as in Scala Spark?
>
> If not, how can I know my task ID from inside the executors?
>
> Thanks!
>
>
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/PySpark-TaskContext-tp28125.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>


-- 
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau
