[
https://issues.apache.org/jira/browse/SPARK-12623?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Sean Owen resolved SPARK-12623.
-------------------------------
Resolution: Not A Problem
It's mapping values to values, as the name implies. You can always write
(pardon the Scala):
{code}
rdd.map { case (key, value) => (key, yourFunctionOf(key, value)) }
{code}
... to change values based on keys too. It doesn't need a special method.
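The same thing in the Python API this request came from would look like the
following; a sketch, with your_function_of as a placeholder just like in the
Scala above:
{code}
# PySpark equivalent of the Scala snippet; your_function_of is a
# placeholder for any function of both the key and the value.
rdd.map(lambda kv: (kv[0], your_function_of(kv[0], kv[1])))
{code}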
Generally you should start with user@, not a JIRA.
> map key_values to values
> ------------------------
>
> Key: SPARK-12623
> URL: https://issues.apache.org/jira/browse/SPARK-12623
> Project: Spark
> Issue Type: New Feature
> Components: Spark Core
> Reporter: Elazar Gershuni
> Priority: Minor
> Labels: easyfix, features, performance
> Original Estimate: 0.5h
> Remaining Estimate: 0.5h
>
> Why doesn't the function passed to mapValues() take the key as an argument?
> Alternatively, can we have a "mapKeyValuesToValues" that does?
> Use case: I want to write a simple analyzer that takes the function passed
> to map() and checks whether it (trivially) leaves the key unchanged, e.g.
> {code}
> g = lambda kv: (kv[0], f(kv[0], kv[1]))
> rdd.map(g)
> {code}
> The problem is that even when I find this to be the case, I can't call
> mapValues() with that function, as in `rdd.mapValues(lambda kv: g(kv)[1])`,
> since the function passed to mapValues() receives only the value as its
> argument.
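For what it's worth, if the motivation is that map() drops the partitioner
while mapValues() keeps it, the Python RDD API already lets a key-preserving
function opt in to the same behavior through the preservesPartitioning flag
on map(). A minimal sketch, reusing the f and g from the description above:
{code}
# Sketch (PySpark): g provably leaves kv[0] unchanged, so it is safe
# to tell Spark the partitioning is preserved, as mapValues() does.
g = lambda kv: (kv[0], f(kv[0], kv[1]))
result = rdd.map(g, preservesPartitioning=True)
{code}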