Hi all,

In the word count example:
val textFile = sc.textFile("Sample.txt")
val counts = textFile.flatMap(line => line.split(" "))
                     .map(word => (word, 1))
                     .reduceByKey(_ + _)
counts.saveAsTextFile("hdfs://master:8020/user/abc")

I want to write the collected contents of "counts" from the code above to HDFS, so:

val x = counts.collect()

What I actually want is to write "x" to HDFS, but Spark seems to require an RDD in order to write anything to HDFS. How can I write an Array[(String, Int)] to HDFS?

-- Uğur
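P.S. To make the question concrete: the only workaround I can think of is to turn the collected array back into an RDD with parallelize and save that, roughly as sketched below (it continues from the spark-shell snippet above, and the output path is just a placeholder). I am not sure this is the idiomatic way, which is why I am asking.

// Continuing from above: x is the Array[(String, Int)] returned by counts.collect()
sc.parallelize(x.toSeq)                                        // re-distribute the driver-side array as an RDD
  .map { case (word, count) => s"$word\t$count" }              // one "word<TAB>count" line per pair
  .saveAsTextFile("hdfs://master:8020/user/abc_collected")     // placeholder output path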