Job with spark

2015-06-17 Thread Sergio Jiménez Barrio
I am a telecommunications engineering student and this year I worked with Spark. It is a field that I like, and I want to know whether there are jobs in this area. Thanks for all. Regards

Fwd: Re: How to keep a SQLContext instance alive in a spark streaming application's life cycle?

2015-06-10 Thread Sergio Jiménez Barrio
Note: CCing user@spark.apache.org. First, you must check if the RDD is empty: messages.foreachRDD { rdd => if (!rdd.isEmpty) { ... } }. Now you can obtain the instance of a SQLContext: val sqlContext = SQLContextSingleton.getInstance(rdd.sparkContext)
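The SQLContextSingleton helper used above is not built into Spark; a minimal sketch of it, following the lazily-instantiated singleton pattern from the Spark 1.x streaming programming guide, might look like:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

// Lazily-created singleton so the same SQLContext instance is reused
// across all micro-batches instead of being rebuilt for every batch.
object SQLContextSingleton {
  @transient private var instance: SQLContext = _

  def getInstance(sparkContext: SparkContext): SQLContext = {
    if (instance == null) {
      instance = new SQLContext(sparkContext)
    }
    instance
  }
}
```

Because the object lives on the driver, the instance survives for the whole lifetime of the streaming application, which is what the original question asks for.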

Spark streaming closes with Cassandra Conector

2015-05-09 Thread Sergio Jiménez Barrio
I am trying to save some data to Cassandra from a Spark Streaming app: Messages.foreachRDD { . . . CassandraRDD.saveToCassandra(test, test) } When I run it, the app closes when I receive data, or it can't connect to Cassandra. Any ideas? Thanks -- Atte. Sergio Jiménez
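With the spark-cassandra-connector, saveToCassandra is called on the RDD itself (not on a separate CassandraRDD object), and guarding against empty batches avoids failed writes when no data arrives. A sketch under assumptions: a stream of (id, value) pairs, a test.test table with matching columns, and spark.cassandra.connection.host set in the SparkConf:

```scala
import com.datastax.spark.connector._  // adds saveToCassandra to RDDs

// Hypothetical (id, value) pair stream; keyspace/table "test"/"test"
// come from the question, the column names are assumptions.
messages.foreachRDD { rdd =>
  if (!rdd.isEmpty) {  // skip micro-batches with no data
    rdd.saveToCassandra("test", "test", SomeColumns("id", "value"))
  }
}
```

If the application dies on connect, checking spark.cassandra.connection.host and the connector/Spark version compatibility is usually the first step.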

How update counter in cassandra

2015-05-06 Thread Sergio Jiménez Barrio
I have a counter column family in Cassandra. I want to update these counters from a Spark Streaming application. How can I update Cassandra counters with Spark? Thanks.
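Counter columns in Cassandra can only be changed with `UPDATE ... SET c = c + ?`, so one option is to issue that CQL directly through the connector's session pool. A sketch, assuming a stream of (key, delta) pairs and a hypothetical test.counters table with an id key and a hits counter:

```scala
import com.datastax.spark.connector.cql.CassandraConnector

val connector = CassandraConnector(ssc.sparkContext.getConf)

counts.foreachRDD { rdd =>
  rdd.foreachPartition { partition =>
    // One pooled session per partition, reused for every row in it.
    connector.withSessionDo { session =>
      partition.foreach { case (key: String, delta: Long) =>
        session.execute(
          "UPDATE test.counters SET hits = hits + ? WHERE id = ?",
          java.lang.Long.valueOf(delta), key)
      }
    }
  }
}
```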

AJAX with Apache Spark

2015-05-04 Thread Sergio Jiménez Barrio
Hi, I am trying to create a dashboard for an Apache Spark job. I need to run Spark Streaming 24/7 and, when an AJAX request arrives, respond with the current state of the job. I have created the client and the Spark program. I tried to create the response service with Play, but this runs the
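One way to serve such AJAX requests without blocking the streaming job is to keep the state in driver-side shared memory and expose it over a small HTTP endpoint running in the same JVM. This sketch uses the JDK's built-in HttpServer instead of Play purely to illustrate the polling pattern; the endpoint path, port, and the choice of "total messages received" as the state are all assumptions:

```scala
import java.net.InetSocketAddress
import java.util.concurrent.atomic.AtomicLong
import com.sun.net.httpserver.{HttpExchange, HttpHandler, HttpServer}

// Driver-side state, updated by the streaming job on every batch.
val totalReceived = new AtomicLong(0L)
messages.count().foreachRDD { rdd => totalReceived.addAndGet(rdd.first()) }

// Minimal JSON status endpoint for the AJAX client to poll.
val server = HttpServer.create(new InetSocketAddress(8080), 0)
server.createContext("/status", new HttpHandler {
  override def handle(exchange: HttpExchange): Unit = {
    val body = s"""{"received":${totalReceived.get}}""".getBytes("UTF-8")
    exchange.getResponseHeaders.add("Content-Type", "application/json")
    exchange.sendResponseHeaders(200, body.length)
    exchange.getResponseBody.write(body)
    exchange.close()
  }
})
server.start()  // runs alongside ssc.start(); ssc.awaitTermination() keeps the driver alive
```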

Re: Convert DStream[Long] to Long

2015-04-25 Thread Sergio Jiménez Barrio
Regards On Fri, Apr 24, 2015 at 11:20 PM, Sergio Jiménez Barrio drarse.a...@gmail.com wrote: Hi, I need to compare whether the count of received messages is 0 or not, but messages.count() returns a DStream[Long]. I tried this solution: val cuenta = messages.count().foreachRDD{ rdd

Re: Convert DStream[Long] to Long

2015-04-24 Thread Sergio Jiménez Barrio
no data so far but may have data in the future. That's why I say you can count records received to date. On Fri, Apr 24, 2015 at 1:57 PM, Sergio Jiménez Barrio drarse.a...@gmail.com wrote: My problem is that I need to know if I have a DStream with data. If in this second I didn't receive data, I

Re: Convert DStream[Long] to Long

2015-04-24 Thread Sergio Jiménez Barrio
But if I use messages.count().print, this shows a single number :/ 2015-04-24 20:22 GMT+02:00 Sean Owen so...@cloudera.com: It's not a Long. It's an infinite stream of Longs. On Fri, Apr 24, 2015 at 2:20 PM, Sergio Jiménez Barrio drarse.a...@gmail.com wrote: It isn't the sum. This is the code

Re: Convert DStream to DataFrame

2015-04-24 Thread Sergio Jiménez Barrio
for all! 2015-04-23 10:29 GMT+02:00 Sergio Jiménez Barrio drarse.a...@gmail.com: Thank you very much, Tathagata! On Wednesday, April 22, 2015, Tathagata Das t...@databricks.com wrote: Aaah, that. That is probably a limitation of the SQLContext (cc'ing Yin for more information

Convert DStream[Long] to Long

2015-04-24 Thread Sergio Jiménez Barrio
Hi, I need to compare whether the count of received messages is 0 or not, but messages.count() returns a DStream[Long]. I tried this solution: val cuenta = messages.count().foreachRDD{ rdd => rdd.first() } But
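count() produces a DStream holding one Long per batch interval, so there is no single Long to extract; the comparison has to happen inside foreachRDD, once per batch. A sketch of the pattern this thread converges on:

```scala
// Each batch's count arrives as a one-element RDD[Long] inside foreachRDD.
messages.count().foreachRDD { rdd =>
  val n: Long = rdd.first()  // the count for this batch only
  if (n == 0L) {
    // no messages received in this batch interval
  } else {
    // process the batch
  }
}
```

An equivalent and cheaper check is calling rdd.isEmpty on the message RDD itself, as suggested elsewhere in this archive.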

Re: Convert DStream to DataFrame

2015-04-23 Thread Sergio Jiménez Barrio
Thank you very much, Tathagata! On Wednesday, April 22, 2015, Tathagata Das t...@databricks.com wrote: Aaah, that. That is probably a limitation of the SQLContext (cc'ing Yin for more information). On Wed, Apr 22, 2015 at 7:07 AM, Sergio Jiménez Barrio drarse.a...@gmail.com

Re: Convert DStream to DataFrame

2015-04-22 Thread Sergio Jiménez Barrio
about sqlContext.createDataFrame(rdd)? On 22 Apr 2015 23:04, Sergio Jiménez Barrio drarse.a...@gmail.com wrote: Hi, I am using Kafka with Spark Streaming to send JSON to Apache Spark: val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams
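Tying the thread together: the Kafka direct stream yields (key, value) pairs whose values are the JSON strings, and each batch's RDD can be turned into a DataFrame. A sketch, assuming Spark 1.3-era APIs and hypothetical kafkaParams/topics values:

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.kafka.KafkaUtils

val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
  ssc, kafkaParams, topics)

messages.foreachRDD { rdd =>
  if (!rdd.isEmpty) {
    val sqlContext = new SQLContext(rdd.sparkContext)  // or a shared singleton
    val df = sqlContext.jsonRDD(rdd.map(_._2))         // parse the JSON payloads
    df.registerTempTable("events")                     // hypothetical table name
  }
}
```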

Re: From DataFrame to LabeledPoint

2015-04-07 Thread Sergio Jiménez Barrio
. On Mon, Apr 6, 2015 at 6:53 AM, Sergio Jiménez Barrio drarse.a...@gmail.com wrote: Hi! I had tried your solution, and I saw that the first row is null. Is this important? Can I work with null rows? Some rows have some columns with null values. This is the first row of the DataFrame: scala
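If the model cannot handle nulls, one option is to drop the incomplete rows before mapping to LabeledPoint. A sketch, assuming the label sits in column 0 and all remaining columns are numeric features:

```scala
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint

// na.drop() removes any row containing a null, so getDouble is safe below.
val labeled = df.na.drop().map { row =>
  val label = row.getDouble(0)
  val features = (1 until row.length).map(row.getDouble).toArray
  LabeledPoint(label, Vectors.dense(features))
}
```

Alternatively, df.na.fill(0.0) keeps the rows and substitutes a default value when dropping data is not acceptable.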