Re: SparkContext declared as object variable

2015-09-24 Thread Priya Ch
object StreamJob {
  val conf = new SparkConf
  val sc = new SparkContext(conf)

  def main(args: Array[String]) {
    val baseRDD = sc.parallelize(Array("hi", "hai", "hi", "bye", "bye", "hi", "hai", "hi", "bye", "bye"))
    val words = baseRDD.flatMap(line => line.split(","))
    val wordPairs = words.map(
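The snippet above is cut off after `words.map(`, but it appears to be building toward a word count. A minimal sketch of how it might continue, assuming the usual map-to-pairs and reduceByKey steps (the `reduceByKey`, `collect`, and `stop` calls are my assumptions, not part of the original message):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object StreamJob {
  // Note: holding the SparkContext in the object body means it is created
  // during object initialization, which is the pattern being questioned in
  // this thread (closures capturing the enclosing object can hit
  // serialization problems).
  val conf = new SparkConf().setAppName("StreamJob") // app name is assumed
  val sc = new SparkContext(conf)

  def main(args: Array[String]): Unit = {
    val baseRDD = sc.parallelize(Array("hi", "hai", "hi", "bye", "bye", "hi", "hai", "hi", "bye", "bye"))
    val words = baseRDD.flatMap(line => line.split(","))
    // Hypothetical continuation: classic word count
    val wordPairs = words.map(word => (word, 1))
    val counts = wordPairs.reduceByKey(_ + _)
    counts.collect().foreach(println)
    sc.stop()
  }
}
```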

Re: SparkContext declared as object variable

2015-09-23 Thread Akhil Das
Yes, of course it works.

Thanks
Best Regards

On Tue, Sep 22, 2015 at 4:53 PM, Priya Ch wrote:
> Parallelizing some collection (array of strings). In fact, in our product we
> are reading data from Kafka using KafkaUtils.createStream and applying some
> transformations.

Re: SparkContext declared as object variable

2015-09-22 Thread Akhil Das
It's a "value", not a variable, and what are you parallelizing here?

Thanks
Best Regards

On Fri, Sep 18, 2015 at 11:21 PM, Priya Ch wrote:
> Hello All,
>
> Instead of declaring sparkContext in main, I declared it as an object variable
> as -
>
> object sparkDemo
> {
>
> val conf = new SparkConf
> va
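The val/var distinction Akhil is pointing at can be shown with plain Scala, independent of Spark (a small illustrative example, not from the thread):

```scala
object ValVsVar {
  def main(args: Array[String]): Unit = {
    val value = 10      // immutable binding: reassignment is a compile error
    var variable = 10   // mutable binding: reassignment is allowed
    variable += 5
    // value += 5       // would not compile: "reassignment to val"
    println(variable)
  }
}
```

So `val sc = new SparkContext(conf)` is an immutable value bound once when the enclosing object is initialized, not a variable that can be reassigned later.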

SparkContext declared as object variable

2015-09-18 Thread Priya Ch
Hello All,

Instead of declaring sparkContext in main, I declared it as an object variable as -

object sparkDemo
{
  val conf = new SparkConf
  val sc = new SparkContext(conf)

  def main(args: Array[String]) {
    val baseRdd = sc.parallelize()
    .
    .
    .
  }
}

But this piece of code is giving
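For contrast, the more conventional pattern keeps the SparkContext local to main, so it is created only when the driver program actually runs rather than during object initialization. A sketch under that assumption (the app name and sample data are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object sparkDemo {
  def main(args: Array[String]): Unit = {
    // Creating the context inside main avoids initializing it as part of
    // object construction and keeps it out of closures shipped to executors.
    val conf = new SparkConf().setAppName("sparkDemo") // app name assumed
    val sc = new SparkContext(conf)
    val baseRdd = sc.parallelize(Seq("a", "b", "c")) // placeholder data
    println(baseRdd.count())
    sc.stop()
  }
}
```

Note that `sc.parallelize()` with no argument, as written in the original message, would not compile; `parallelize` requires a collection.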