Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/5147#issuecomment-85275593
  
    I'm a little hesitant to add a new `withPartition`- or `withSplit`-like method, since we've been deprecating those in favor of `TaskContext`.  Can you address your use case with `TaskContext.get` and `printPipeContext`?  For example, how about this:
    
    ```scala
    myRDD.pipe(
      command = Seq("...."),
      printPipeContext = p => p("PARTITION=" + TaskContext.get.partitionId()))
    ```
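    
    (If I'm remembering the pipe internals correctly, `printPipeContext` is invoked once per partition with a print function that writes lines to the piped command's stdin ahead of the partition's elements, so your script would see the `PARTITION=...` line as its first line of input rather than in its environment.)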
    
    Is the problem that you want to embed the partition id in the command or its environment?  If so, then maybe we could generalize this so that the function is invoked with a `TaskContext` instead of a `Partition` (in other words, change it to `pipeWithContext`).
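    
    For concreteness, here's a rough sketch of that shape (the `pipeWithContext` name, signature, and script name below are just assumptions for illustration, not an existing RDD method):
    
    ```scala
    import org.apache.spark.TaskContext
    import org.apache.spark.rdd.RDD
    
    object PipeWithContextSketch {
      // Hypothetical signature: the caller's env-building function receives the
      // TaskContext, so it can use partitionId(), stageId(), attemptNumber(), etc.
      // The body is a stub; a real version would pass env(ctx) to the subprocess
      // launched for each partition, alongside the existing pipe machinery.
      def pipeWithContext[T](
          rdd: RDD[T],
          command: Seq[String],
          env: TaskContext => Map[String, String] = _ => Map.empty): RDD[String] = ???
    }
    
    // Call-site sketch (hypothetical script name): export the partition id into
    // the piped command's environment.
    // PipeWithContextSketch.pipeWithContext(
    //   myRDD,
    //   command = Seq("./process.sh"),
    //   env = ctx => Map("PARTITION" -> ctx.partitionId().toString))
    ```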

