On the front page of the Spark website, http://spark.incubator.apache.org/, there is the following simple word-count implementation:
val file = spark.textFile("hdfs://...")
file.flatMap(line => line.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
The same code can be found in the Quick Start guide.
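For readers without a cluster handy, the same transformation chain can be sketched with plain Scala collections (an assumption on my part - this is a local analogue, not Spark code, so there is no spark context or HDFS involved):

```scala
object LocalWordCount {
  def main(args: Array[String]): Unit = {
    // Stand-in for the lines of the HDFS file
    val lines = Seq("to be or not", "to be")

    val counts = lines
      .flatMap(_.split(" "))          // split each line into words
      .map(word => (word, 1))         // pair each word with a count of 1
      .groupBy(_._1)                  // local analogue of reduceByKey's grouping
      .map { case (w, ps) => (w, ps.map(_._2).sum) }  // sum the 1s per word

    println(counts)  // e.g. Map(to -> 2, be -> 2, or -> 1, not -> 1)
  }
}
```

In Spark the groupBy/sum step is fused into reduceByKey, which also combines counts on each node before shuffling them across the network.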
Thanks - I think this would be a helpful note to add to the docs. I went and read a few things about Scala implicit conversions (I'm obviously new to the language), and they seem like a very powerful language feature; now that I know about them it will certainly be easier to identify when they are being used.
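For anyone else following along, a minimal sketch of the mechanism being discussed (my own toy example, not Spark's actual source - Spark uses this pattern to make reduceByKey available on RDDs of pairs):

```scala
import scala.language.implicitConversions

object ImplicitDemo {
  // A wrapper class that adds a method String does not have
  class RichWords(s: String) {
    def wordCount: Int = s.split(" ").count(_.nonEmpty)
  }

  // The implicit conversion: whenever a String is used where a
  // RichWords method is called, the compiler inserts this call.
  implicit def stringToRichWords(s: String): RichWords = new RichWords(s)

  def main(args: Array[String]): Unit = {
    // wordCount is not defined on String, yet this compiles, because the
    // compiler silently rewrites it as stringToRichWords("...").wordCount
    println("to be or not to be".wordCount)  // prints 6
  }
}
```

This is also why the error messages can be confusing: when no conversion is in scope, the compiler just reports that the method doesn't exist on the original type, with no hint about the missing import.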
Yeah, it’s true that this feature doesn’t provide any way to give good error
messages. Maybe some IDEs will support it eventually, though I haven’t seen it.
Matei
On Nov 7, 2013, at 3:46 PM, Philip Ogren philip.og...@oracle.com wrote: