Re: Incrementally load big RDD file into Memory

2015-04-09 Thread MUHAMMAD AAMIR

Re: Incrementally load big RDD file into Memory

2015-04-08 Thread Guillaume Pitel
but when I try to use cartesienProduct it gets stuck, i.e. val count = cartesienProduct.count(). Any help to do this efficiently will be highly appreciated.
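
One common way to avoid the stall, assuming the distinct locations are small enough to collect on the driver (the original code already collect()s them), is to broadcast the small array and expand pairs with flatMap instead of calling cartesian on two large RDDs. A minimal sketch; every identifier beyond those quoted in the thread is hypothetical, and the edge attribute (Euclidean distance) is a placeholder since the original attribute is cut off in the archive:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.graphx.Edge

    object AllPairsSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("all-pairs-sketch"))

        // Stand-in for the collected (id, (x, y)) tuples from the original post.
        val locations: Array[(Long, (Double, Double))] =
          Array((1L, (0.0, 0.0)), (2L, (3.0, 4.0)), (3L, (6.0, 8.0)))

        // Ship one read-only copy of the small side to every executor.
        val locBc = sc.broadcast(locations)

        // Pair each distributed element with every broadcast element; this
        // yields the same pairs as RDD.cartesian, without the all-to-all shuffle.
        val edges = sc.parallelize(locations).flatMap { case (srcId, (x1, y1)) =>
          locBc.value.map { case (dstId, (x2, y2)) =>
            // Hypothetical edge attribute: distance between the two points.
            Edge(srcId, dstId, math.hypot(x2 - x1, y2 - y1))
          }
        }

        println(edges.count())  // n * n pairs (9 in this toy example)
        sc.stop()
      }
    }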

Incrementally load big RDD file into Memory

2015-04-07 Thread mas
perfectly fine up till here, but when I try to use cartesienProduct it gets stuck, i.e. val count = cartesienProduct.count(). Any help to do this efficiently will be highly appreciated.
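
For a sense of scale (an illustration with round numbers, not the poster's actual data): cartesian of an RDD with itself materializes n^2 pairs, so 100,000 distinct locations already mean 100,000^2 = 10^10 pairs; at roughly 16 bytes per (Long, Long) pair that is on the order of 160 GB of shuffle data before any edge attributes, which is why count() appears to hang rather than fail outright.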

RE: Incremently load big RDD file into Memory

2015-04-07 Thread java8964
To: user@spark.apache.org
Subject: Incrementally load big RDD file into Memory

val locations = filelines.map(line => line.split("\t"))
  .map(t => (t(5).toLong, (t(2).toDouble, t(3).toDouble)))
  .distinct()
  .collect()
val cartesienProduct = locations.cartesian(locations)
  .map(t => Edge(t._1._1, t._2._1
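
Un-garbled, the quoted pipeline still has a type error: collect() returns a plain Scala Array, which has no cartesian method (that is an RDD operation). A minimal sketch of the same pipeline with locations kept as an RDD; the input path and the 0.0 edge attribute are assumptions, since the original line is cut off in the archive:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.graphx.Edge

    val sc = new SparkContext(new SparkConf().setAppName("locations-pairs"))

    // Hypothetical input path; columns t(2), t(3), t(5) follow the quoted layout.
    val filelines = sc.textFile("hdfs:///path/to/locations.tsv")

    // Kept as an RDD (no collect()), so cartesian is available.
    val locations = filelines
      .map(_.split("\t"))
      .map(t => (t(5).toLong, (t(2).toDouble, t(3).toDouble)))
      .distinct()

    // n^2 pairs: fine for small n; see the broadcast sketch earlier
    // in the thread for large n.
    val cartesienProduct = locations.cartesian(locations)
      .map(t => Edge(t._1._1, t._2._1, 0.0))  // 0.0 is a placeholder attribute

    val count = cartesienProduct.count()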