Cool - thanks. I'll circle back with the JIRA number once I've got it
created - it will probably take a while before it lands in a Spark release
(since 2.1 has already branched), but better debugging information for
Python users is certainly important/useful.

On Thu, Nov 24, 2016 at 2:03 AM, Ofer Eliassaf <ofer.elias...@gmail.com>
wrote:

> Since we can't use log4j in PySpark executors, we built our own
> logging infrastructure (based on Logstash/Elasticsearch/Kibana).
> It would help to have the task ID (TID) in the logs, so we can drill down
> accordingly.
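>
> For now, a minimal workaround sketch that tags executor-side log lines
> with the partition index (assuming the partition index is enough to drill
> down by, and that `rdd` and the handler shipping logs to Logstash are set
> up elsewhere):
>
>     import logging
>
>     def process_partition(split_index, records):
>         # Until TaskContext is exposed in PySpark, tag each executor-side
>         # log line with the partition index so it can be correlated later.
>         log = logging.getLogger("executor")
>         for record in records:
>             log.info("partition=%d record=%s", split_index, record)
>             yield record
>
>     tagged = rdd.mapPartitionsWithIndex(process_partition)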
>
>
> On Thu, Nov 24, 2016 at 11:48 AM, Holden Karau <hol...@pigscanfly.ca>
> wrote:
>
>> Hi,
>>
>> TaskContext isn't currently exposed in PySpark, but I've been meaning
>> to look at exposing at least some of it for parity. Is there a particular
>> use case you want this for? It would help with crafting the JIRA :)
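>>
>> For concreteness, a purely hypothetical sketch of what parity could look
>> like on the Python side (TaskContext.get(), partitionId() and
>> taskAttemptId() mirror the existing Scala methods; none of this is in
>> PySpark today):
>>
>>     from pyspark import TaskContext  # hypothetical import, not shipped
>>
>>     def process_partition(records):
>>         ctx = TaskContext.get()  # assumed Python mirror of the Scala API
>>         for record in records:
>>             yield (ctx.partitionId(), ctx.taskAttemptId(), record)
>>
>>     rdd.mapPartitions(process_partition)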
>>
>> Cheers,
>>
>> Holden :)
>>
>> On Thu, Nov 24, 2016 at 1:39 AM, ofer <ofer.elias...@gmail.com> wrote:
>>
>>> Hi,
>>> Is there a way in PySpark to get something like the TaskContext from
>>> code running on an executor, as in Scala Spark?
>>>
>>> If not, how can I know my task ID from inside the executors?
>>>
>>> Thanks!
>>>
>>>
>>>
>>> --
>>> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/PySpark-TaskContext-tp28125.html
>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>>>
>>>
>>
>>
>> --
>> Cell : 425-233-8271
>> Twitter: https://twitter.com/holdenkarau
>>
>
>
>
> --
> Regards,
> Ofer Eliassaf
>



-- 
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau
