Spark Mesos Dispatcher

2015-07-19 Thread Jahagirdar, Madhu
All, can we run different versions of Spark using the same Mesos Dispatcher? For example, can we run drivers with Spark 1.3 and Spark 1.4 at the same time? Regards, Madhu Jahagirdar
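For context, the Spark-on-Mesos configuration ties the Spark distribution used by executors to each application through spark.executor.uri, so in principle every driver submitted to the same dispatcher can carry its own version. A minimal sketch, with a placeholder tarball location; when submitting through the MesosClusterDispatcher the same property would be passed to spark-submit with --conf:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: each application points the Mesos executors at its own Spark tarball.
object PerAppSparkVersionSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("spark-1.3-driver")
      // Placeholder path; a Spark 1.4 driver would point at its own 1.4 tarball instead.
      .set("spark.executor.uri", "hdfs:///frameworks/spark-1.3.1-bin-hadoop2.4.tgz")

    val sc = new SparkContext(conf)
    // ... application logic ...
    sc.stop()
  }
}
```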

RE: Spark Mesos Dispatcher

2015-07-19 Thread Jahagirdar, Madhu
Yes. Sent from my iPhone. On 19 Jul 2015, at 10:52 pm, Jahagirdar, Madhu madhu.jahagir...@philips.com wrote: All, can we run different versions of Spark using the same Mesos Dispatcher

Spark Drill 1.2.1 - error

2015-02-26 Thread Jahagirdar, Madhu
All, we are getting the error below when using the Drill JDBC driver with Spark; please let us know what the issue could be. java.lang.IllegalAccessError: class io.netty.buffer.UnsafeDirectLittleEndian cannot access its superclass io.netty.buffer.WrappedByteBuf at
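The stack trace points at a Netty version clash on the classpath rather than at the query code itself, but for reference, here is a minimal sketch of one way the Drill JDBC driver might be wired into Spark 1.2 via JdbcRDD; the ZooKeeper connect string, table, and bounds are placeholders:

```scala
import java.sql.{DriverManager, ResultSet}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.JdbcRDD

object DrillJdbcSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("drill-jdbc-sketch"))

    // Placeholder Drill connect string; the real one depends on the cluster's ZooKeeper quorum.
    val url = "jdbc:drill:zk=zkhost:2181"

    val rows = new JdbcRDD(
      sc,
      () => {
        // Register the driver on the executor before opening a connection.
        Class.forName("org.apache.drill.jdbc.Driver")
        DriverManager.getConnection(url)
      },
      // JdbcRDD requires two '?' placeholders that it fills with per-partition bounds.
      "SELECT id, name FROM dfs.tmp.`sample` WHERE id >= ? AND id <= ?",
      1L, 1000L, 4,
      (rs: ResultSet) => (rs.getLong(1), rs.getString(2)))

    rows.take(10).foreach(println)
    sc.stop()
  }
}
```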

Re: Can we say 1 RDD is generated every batch interval?

2014-12-30 Thread Jahagirdar, Madhu
Foreach iterates through the partitions of the RDD and executes the operations for each partition, I guess. On 29-Dec-2014, at 10:19 pm, SamyaMaiti samya.maiti2...@gmail.com wrote: Hi All, please clarify: can we say 1 RDD is generated every batch interval? If the above is true, then is
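A minimal sketch of what "one RDD per batch interval" looks like in practice, assuming a socket text stream and a 10-second batch (source and host are placeholders):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object OneRddPerBatchSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("one-rdd-per-batch")
    val ssc = new StreamingContext(conf, Seconds(10))

    val lines = ssc.socketTextStream("localhost", 9999) // placeholder source

    // foreachRDD is invoked once per batch interval with the RDD generated for that batch;
    // foreachPartition then runs the per-partition work on the executors.
    lines.foreachRDD { rdd =>
      rdd.foreachPartition { partition =>
        partition.foreach(line => println(line))
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```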

RE: CheckPoint Issue with JsonRDD

2014-11-07 Thread Jahagirdar, Madhu
Michael, any idea on this? From: Jahagirdar, Madhu Sent: Thursday, November 06, 2014 2:36 PM To: mich...@databricks.com; user Subject: CheckPoint Issue with JsonRDD When we enable checkpointing and use JsonRDD we get the following error: is this a bug

CheckPoint Issue with JsonRDD

2014-11-06 Thread Jahagirdar, Madhu
When we enable checkpointing and use JsonRDD we get the following error: is this a bug? Exception in thread "main" java.lang.NullPointerException at org.apache.spark.rdd.RDD.<init>(RDD.scala:125) at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:103)
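The NPE in the SchemaRDD constructor suggests the SQL side is getting tangled up with the checkpointed state. One commonly suggested pattern (an assumption here, not a confirmed fix for this report) is to build the SQLContext inside foreachRDD from the RDD's own SparkContext instead of capturing one from the driver; a sketch with placeholder source, directory, and table name:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.{Seconds, StreamingContext}

object JsonRddCheckpointSketch {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(new SparkConf().setAppName("jsonrdd-checkpoint"), Seconds(10))
    ssc.checkpoint("hdfs:///tmp/checkpoint") // placeholder directory

    val jsonLines = ssc.socketTextStream("localhost", 9999) // placeholder source of JSON strings

    jsonLines.foreachRDD { rdd =>
      // Re-derive the SQLContext from the RDD's SparkContext inside the closure so that
      // no SQL objects are captured in the checkpointed DStream graph.
      val sqlContext = new SQLContext(rdd.sparkContext)
      val schemaRdd = sqlContext.jsonRDD(rdd)
      schemaRdd.registerTempTable("events") // hypothetical table name
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```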

Dynamically InferSchema From Hive and Create parquet file

2014-11-05 Thread Jahagirdar, Madhu
Currently the createParquetFile method needs a bean class as one of its parameters: javaHiveContext.createParquetFile(XBean.class, IMPALA_TABLE_LOC, true, new Configuration())
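As a hedged alternative sketch (not necessarily what createParquetFile itself was designed for): a SchemaRDD obtained from a HiveContext already carries the schema inferred from the Hive metastore, so it can be written out with saveAsParquetFile without declaring a bean class. Table name and output path below are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveToParquetSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("hive-to-parquet"))
    val hiveContext = new HiveContext(sc)

    // The SchemaRDD's schema comes from the Hive metastore, so no bean class is needed.
    val table = hiveContext.sql("SELECT * FROM my_database.my_table") // placeholder table
    table.saveAsParquetFile("hdfs:///warehouse/my_table_parquet")     // placeholder location

    sc.stop()
  }
}
```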

Issue with Spark Twitter Streaming

2014-10-13 Thread Jahagirdar, Madhu
All, we are using Spark Streaming to receive data from the Twitter stream. This is running behind a proxy. We have done the following configuration inside Spark Streaming for twitter4j to work behind the proxy. def main(args: Array[String]) { val filters = Array("Modi")
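For reference, a minimal sketch of one way to pass proxy settings to twitter4j through its ConfigurationBuilder and hand the resulting authorization to TwitterUtils; the OAuth keys, proxy host, and filters are placeholders:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.twitter.TwitterUtils
import twitter4j.auth.OAuthAuthorization
import twitter4j.conf.ConfigurationBuilder

object TwitterBehindProxySketch {
  def main(args: Array[String]): Unit = {
    val filters = Array("Modi")

    // Placeholder credentials and proxy settings.
    val cb = new ConfigurationBuilder()
      .setOAuthConsumerKey("consumerKey")
      .setOAuthConsumerSecret("consumerSecret")
      .setOAuthAccessToken("accessToken")
      .setOAuthAccessTokenSecret("accessTokenSecret")
      .setHttpProxyHost("proxy.example.com")
      .setHttpProxyPort(8080)

    val auth = new OAuthAuthorization(cb.build())

    val ssc = new StreamingContext(new SparkConf().setAppName("twitter-proxy"), Seconds(10))
    val tweets = TwitterUtils.createStream(ssc, Some(auth), filters)
    tweets.map(_.getText).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```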

RE: Dstream Transformations

2014-10-06 Thread Jahagirdar, Madhu
From: Akhil Das [ak...@sigmoidanalytics.com] Sent: Monday, October 06, 2014 1:20 PM To: Jahagirdar, Madhu Cc: user Subject: Re: Dstream Transformations AFAIK Spark doesn't restart worker nodes itself. You can have multiple worker nodes, and in that case, if one worker node goes down, then Spark

RE: Dstream Transformations

2014-10-06 Thread Jahagirdar, Madhu
To: Jahagirdar, Madhu Cc: Akhil Das; user Subject: Re: Dstream Transformations From the Spark Streaming Programming Guide (http://spark.apache.org/docs/latest/streaming-programming-guide.html#failure-of-a-worker-node): ...output operations (like foreachRDD) have at-least once semantics
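Given the at-least once guarantee quoted above, the usual advice is to make the output operation idempotent or transactional so that a replayed batch after a worker failure does not duplicate results. A generic sketch, where IdempotentStore and connect() are hypothetical stand-ins for a real sink client:

```scala
import org.apache.spark.streaming.dstream.DStream

// Hypothetical store client used only for illustration: putIfAbsent makes retries harmless.
trait IdempotentStore {
  def putIfAbsent(key: String, value: String): Unit
  def close(): Unit
}

object AtLeastOnceOutputSketch {
  // Hypothetical factory; in practice this would open a real connection per partition.
  def connect(): IdempotentStore = ???

  def save(records: DStream[(String, String)]): Unit = {
    records.foreachRDD { rdd =>
      rdd.foreachPartition { partition =>
        val store = connect()
        try {
          // Keyed, idempotent writes mean a replayed batch overwrites or skips
          // existing records instead of duplicating output.
          partition.foreach { case (key, value) => store.putIfAbsent(key, value) }
        } finally {
          store.close()
        }
      }
    }
  }
}
```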

hdfs short circuit

2014-07-03 Thread Jahagirdar, Madhu
Can I enable Spark to use the dfs.client.read.shortcircuit property to improve performance and read natively on local nodes instead of going through the HDFS API?
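A hedged sketch of setting the client-side properties on the Hadoop configuration Spark uses, assuming the DataNodes are already set up for short-circuit reads with the matching domain socket and the native Hadoop library; the socket path and input file are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ShortCircuitReadSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("short-circuit-sketch"))

    // Client-side HDFS settings; the DataNodes must expose the same domain socket.
    sc.hadoopConfiguration.set("dfs.client.read.shortcircuit", "true")
    sc.hadoopConfiguration.set("dfs.domain.socket.path", "/var/lib/hadoop-hdfs/dn_socket") // placeholder

    val lines = sc.textFile("hdfs:///data/sample.txt") // placeholder path
    println(lines.count())
    sc.stop()
  }
}
```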