RE: How to set Spark executor memory?

2015-03-16 Thread jishnu.prathap
Hi Xi Shen, You could set spark.executor.memory in the code itself: new SparkConf().set("spark.executor.memory", "2g"). Or you can pass --conf spark.executor.memory=2g while submitting the jar. Regards Jishnu Prathap From: Akhil Das [mailto:ak...@sigmoidanalytics.com] Sent: Monday, March 16
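A minimal sketch of both approaches mentioned above (the 2g value and app name are illustrative; note that spark.executor.memory must be set before the SparkContext is created):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Option 1: set executor memory programmatically, before creating the context
val conf = new SparkConf()
  .setAppName("MemoryExample")
  .set("spark.executor.memory", "2g")
val sc = new SparkContext(conf)
```

Option 2, on the command line, would be `spark-submit --executor-memory 2g ...` (or equivalently `--conf spark.executor.memory=2g`).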

RE: Spark SQL Stackoverflow error

2015-03-10 Thread jishnu.prathap
import com.google.gson.{GsonBuilder, JsonParser} import org.apache.spark.mllib.clustering.KMeans import org.apache.spark.sql.SQLContext import org.apache.spark.{SparkConf, SparkContext} import org.apache.spark.mllib.clustering.KMeans /** * Examine the collected tweets and trains a model based on th

RE: Error KafkaStream

2015-02-05 Thread jishnu.prathap
Hi, If your message is a String you will have to change the encoder and decoder to StringEncoder, StringDecoder. If your message is byte[] you can use DefaultEncoder & DefaultDecoder. Also don’t forget to add the import statements matching your encoder and decoder. import kafka.ser
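A minimal sketch of the producer-side configuration under the old Kafka 0.8-style API this thread appears to use (the broker address is a placeholder):

```scala
import java.util.Properties
import kafka.producer.ProducerConfig

val props = new Properties()
props.put("metadata.broker.list", "localhost:9092") // placeholder broker address
// String payloads: use the String encoder
props.put("serializer.class", "kafka.serializer.StringEncoder")
// byte[] payloads would instead use kafka.serializer.DefaultEncoder
val config = new ProducerConfig(props)
```

On the Spark Streaming consumer side, the matching decoder (StringDecoder or DefaultDecoder) is supplied as a type parameter to the KafkaUtils.createStream call.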

RE: How to integrate Spark with OpenCV?

2015-01-14 Thread jishnu.prathap
Hi Akhil Thanks for the response Our use case is Object detection in multiple videos. It’s kind of searching an image if present in the video by matching the image with all the frames of the video. I am able to do it in normal java code using OpenCV lib now but I don’t think it is scalable to

Stack overflow Error while executing spark SQL

2014-12-09 Thread jishnu.prathap
Hi I am getting a Stack overflow Error: Exception in thread "main" java.lang.StackOverflowError at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222) at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254) at scala.util.parsing.combinator.Pars
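One common mitigation (an assumption on my part, not a fix confirmed in this thread): deeply nested or very long SQL text can overflow the parser's thread stack, so raising the JVM stack size (-Xss) for the driver and executors may help. A hypothetical sketch:

```scala
import org.apache.spark.SparkConf

// Hypothetical sketch: raise the thread stack size for driver and executors.
// The 4m value is illustrative; tune to the query depth.
val conf = new SparkConf()
  .setAppName("DeepSqlQuery")
  .set("spark.driver.extraJavaOptions", "-Xss4m")
  .set("spark.executor.extraJavaOptions", "-Xss4m")
```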

RE: Persist streams to text files

2014-11-21 Thread jishnu.prathap
Hi Thank you ☺ Akhil, it worked like a charm… I used the file writer outside rdd.foreach; that might have been the reason for the non-serializable exception… Thanks & Regards Jishnu Menath Prathap From: Akhil Das [mailto:ak...@sigmoidanalytics.com] Sent: Friday, November 21, 2014 1:15 PM To: Jishnu Menath Prat

RE: Persist streams to text files

2014-11-20 Thread jishnu.prathap
Hi Akhil Thanks for the reply, but it creates different directories… I tried using a FileWriter but it shows a non-serializable error… val stream = TwitterUtils.createStream(ssc, None) //, filters) val statuses = stream.map( status => sentimentAnalyzer.findSentiment({ stat

Re: Persist streams to text files

2014-11-20 Thread jishnu.prathap
Hi I am also having a similar problem… any fixes suggested? Originally Posted by GaganBM Hi, I am trying to persist the DStreams to text files. When I use the inbuilt API 'saveAsTextFiles' as: stream.saveAsTextFiles(resultDirectory) this creates a number of subdirectories, for each batch, and w

Is it possible to save the streams to one single file?

2014-11-20 Thread jishnu.prathap
Hi My question is generic: 1) Is it possible to save the streams to one single file? If yes, can you give me a link or code sample? 2) I tried using .saveAsTextFiles but it's creating a different file for each stream. I need to update the same file instead of creating different file for
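Spark Streaming's saveAsTextFiles writes one directory per batch by design; a sketch of one workaround (assuming the total volume is small enough to collect on the driver) is to append each batch to a single driver-local file inside foreachRDD:

```scala
import java.io.{FileWriter, PrintWriter}
import org.apache.spark.streaming.dstream.DStream

// Append every batch of a DStream[String] to one file on the driver.
// rdd.collect() pulls all records to the driver, so this only suits small streams.
def saveToSingleFile(stream: DStream[String], path: String): Unit = {
  stream.foreachRDD { rdd =>
    val out = new PrintWriter(new FileWriter(path, true)) // true = append mode
    try rdd.collect().foreach(out.println)
    finally out.close()
  }
}
```

Because the writer is created inside foreachRDD, which runs on the driver, it never needs to be serialized and shipped to executors, which also sidesteps the non-serializable FileWriter error mentioned in the threads above.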

RE: basic twitter stream program not working.

2014-11-13 Thread jishnu.prathap
Hi Thanks Akhil, you saved the day… It's working perfectly… Regards Jishnu Menath Prathap From: Akhil Das [mailto:ak...@sigmoidanalytics.com] Sent: Thursday, November 13, 2014 3:25 PM To: Jishnu Menath Prathap (WT01 - BAS) Cc: Akhil [via Apache Spark User List]; user@spark.apache

runexample TwitterPopularTags showing Class Not found error

2014-11-13 Thread jishnu.prathap
Hi I am getting the following error while running the TwitterPopularTags example. I am using spark-1.1.0-bin-hadoop2.4. jishnu@getafix:~/spark/bin$ run-example TwitterPopularTags Spark assembly has been built with Hive, including Datanucleus jars on classpath j

basic twitter stream program not working.

2014-11-13 Thread jishnu.prathap
Hi I am trying to run a basic twitter stream program but getting blank output. Please correct me if I am missing something. import org.apache.spark.SparkConf import org.apache.spark.streaming.StreamingContext import org.apache.spark.streaming.twitter.TwitterUtils import org.apache.spark.st
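For reference, a minimal sketch of such a program (the OAuth values are placeholders; two frequent causes of blank output are missing twitter4j credentials and running with a single core, so the receiver starves the processing tasks):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.twitter.TwitterUtils

// twitter4j reads OAuth credentials from system properties;
// replace the placeholders with real application keys.
System.setProperty("twitter4j.oauth.consumerKey", "...")
System.setProperty("twitter4j.oauth.consumerSecret", "...")
System.setProperty("twitter4j.oauth.accessToken", "...")
System.setProperty("twitter4j.oauth.accessTokenSecret", "...")

// local[2]: at least two threads, since the receiver occupies one core
val conf = new SparkConf().setAppName("TwitterStream").setMaster("local[2]")
val ssc = new StreamingContext(conf, Seconds(10))
TwitterUtils.createStream(ssc, None).map(_.getText).print()
ssc.start()
ssc.awaitTermination()
```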

RE: java.lang.NumberFormatException while starting spark-worker

2014-09-24 Thread jishnu.prathap
Hi Sorry for the repeated mails. My post was not accepted by the mailing list due to some problem in postmas...@wipro.com, so I had to send it manually. Still it was not visible for half an hour, so I retried. But later all the posts were visible. I deleted them from the page but they were already delivered

Re: java.lang.NumberFormatException while starting spark-worker

2014-09-24 Thread jishnu.prathap
No, I am not passing any argument. I am getting this error while starting the Master. The same Spark binary runs fine on another machine (Ubuntu).

java.lang.NumberFormatException while starting spark-worker

2014-09-24 Thread jishnu.prathap
Hi , I am getting this weird error while starting Worker. -bash-4.1$ spark-class org.apache.spark.deploy.worker.Worker spark://osebi-UServer:59468 Spark assembly has been built with Hive, including Datanucleus jars on classpath 14/09/24 16:22:04 INFO worker.Worker: Registered signal handle

Unable to change the Ports

2014-09-22 Thread jishnu.prathap
Hi Everyone I am new to Spark... I am posting some basic doubts I ran into while trying to create a standalone cluster for a small PoC... 1) My corporate firewall blocked port 7077, which is the default port of the Master URL, so I used start-master.sh --port 8080 (also tried with several other po
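For a firewall-constrained setup, the relevant ports can be pinned explicitly; a sketch (property names from the Spark 1.x configuration page, port values illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Pin the normally random driver port to a firewall-approved value
val conf = new SparkConf()
  .setAppName("FixedPorts")
  .set("spark.driver.port", "51000")
  .set("spark.ui.port", "4041")
val sc = new SparkContext(conf)
```

The master's own listen port is set via SPARK_MASTER_PORT in conf/spark-env.sh or `start-master.sh --port <port>`; note that 8080 is typically already taken by the master's web UI, which may be why that particular choice failed.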
