Re: mutable.LinkedHashMap kryo serialization issues

2016-08-26 Thread Rahul Palamuttam
…mutable.LinkedHashMap? Should I file a JIRA for it? Much credit should be given to Martin Grotzke from EsotericSoftware/kryo, who helped me tremendously.
Best,
Rahul Palamuttam

On Fri, Aug 26, 2016 at 10:16 AM, Rahul Palamuttam wrote:
> Thanks Renato.
> I forgot to reply all last t…

Re: mutable.LinkedHashMap kryo serialization issues

2016-08-26 Thread Rahul Palamuttam
…ed it is null. The iterator requires the firstEntry variable to walk the LinkedHashMap:
https://github.com/scala/scala/blob/v2.11.8/src/library/scala/collection/mutable/LinkedHashMap.scala#L94-L100
I wonder why these two variables were made transient.
Best,
Rahul Palamuttam

On Thu, Aug 25, 2016 a…
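The symptom described above fits how Kryo's default field-based serialization works: unlike Java serialization, it does not run the custom deserialization logic that rebuilds transient fields, so firstEntry stays null and the iterator fails. The sketch below is a hypothetical workaround, not the fix the thread settled on: ship the entries as a plain Seq of pairs and rebuild the map on the other side, sidestepping the transient fields entirely. The helper names are invented for illustration.

```scala
import scala.collection.mutable

// Hypothetical workaround: convert the LinkedHashMap to a plain Seq of
// pairs before serialization, and rebuild it after deserialization, so
// the transient firstEntry/lastEntry fields are never relied upon.
def toWireFormat[K, V](m: mutable.LinkedHashMap[K, V]): Seq[(K, V)] =
  m.toSeq // toSeq preserves insertion order for LinkedHashMap

def fromWireFormat[K, V](pairs: Seq[(K, V)]): mutable.LinkedHashMap[K, V] = {
  val rebuilt = mutable.LinkedHashMap.empty[K, V]
  pairs.foreach { case (k, v) => rebuilt += (k -> v) }
  rebuilt
}

val original     = mutable.LinkedHashMap("a" -> 1, "b" -> 2, "c" -> 3)
val roundTripped = fromWireFormat(toWireFormat(original))
assert(roundTripped.toSeq == original.toSeq) // insertion order survives
```

A registered custom Kryo serializer doing this same Seq round-trip per map would avoid touching call sites, but the snippet keeps to the standard library to stay self-contained.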

mutable.LinkedHashMap kryo serialization issues

2016-08-22 Thread Rahul Palamuttam
…askResultGetter$$anon$2.run(TaskResultGetter.scala:50)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

I hope th…

Re: Unsubscribe

2016-08-21 Thread Rahul Palamuttam
Hi Sudhanshu,

Try user-unsubscribe@spark.apache.org.

- Rahul P

Sent from my iPhone

> On Aug 21, 2016, at 9:19 AM, Sudhanshu Janghel wrote:
>
> Hello,
>
> I wish to unsubscribe from the channel.
>
> Kind regards,
> Sudhanshu

mutable.LinkedHashMap kryo serialization issues

2016-08-20 Thread Rahul Palamuttam
…ent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

I hope this is a known issue and/or I'm missing something important in my setup. Appreciate any help or advice!
Best,
Rahul Palamuttam

Re: Renaming sc variable in sparkcontext throws task not serializable

2016-03-02 Thread Rahul Palamuttam
…wrote:
> I can reproduce it in spark-shell, but it works for a batch job. Looks like a Spark REPL issue.
>
> On Thu, Mar 3, 2016 at 10:43 AM, Rahul Palamuttam wrote:
>
>> Hi All,
>>
>> We recently came across this issue when using the spark-shell and…

Renaming sc variable in sparkcontext throws task not serializable

2016-03-02 Thread Rahul Palamuttam
Hi All,

We recently came across this issue when using the spark-shell and Zeppelin. If we assign the SparkContext variable (sc) to a new variable and reference another variable in an RDD lambda expression, we get a task not serializable exception. The following three lines of code illustrate this…
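The mechanism behind this kind of failure can be sketched without Spark at all: a lambda that references a field of a non-serializable enclosing object captures the whole object, which is essentially what happens when a REPL-defined alias drags the enclosing interpreter line object into the closure. The Driver class below is an invented stand-in, not Spark code.

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Stand-in for a non-serializable driver-side object (like SparkContext).
class Driver {
  val factor = 2
  // Referencing the field `factor` captures `this` (the whole Driver).
  def badClosure: Int => Int = (x: Int) => x * factor
}

// Copying the field into a local val first keeps the closure self-contained:
// only the Int is captured, not the Driver.
def makeGoodClosure(d: Driver): Int => Int = {
  val localFactor = d.factor
  (x: Int) => x * localFactor
}

def isSerializable(obj: AnyRef): Boolean =
  try {
    new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(obj)
    true
  } catch { case _: NotSerializableException => false }

val driver = new Driver
assert(!isSerializable(driver.badClosure))      // drags in the Driver
assert(isSerializable(makeGoodClosure(driver))) // captures only an Int
```

The same local-copy trick (assigning what you need to a val inside the method before the lambda) is a common way to keep REPL closures serializable, though it does not explain why renaming sc specifically triggers the problem in the shell.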

Re: Support of other languages?

2015-09-17 Thread Rahul Palamuttam
…ere is a recent JIRA which I thought was interesting with respect to our discussion:
https://issues.apache.org/jira/browse/SPARK-10399
There's also a suggestion, at the bottom of the JIRA, that considers exposing on-heap memory, which is pretty interesting.
- Rahul Palamuttam

On Wed, Se…

Support of other languages?

2015-09-07 Thread Rahul Palamuttam
…'s in performance or in general software architecture. With Python in particular, the collect operation must first be written to disk and then read back by the Python driver process. Would appreciate any insight on this, and if there is any work happening in this area.
Thank you,
Rahul Palamutt…

Re: Spark build/sbt assembly

2015-07-30 Thread Rahul Palamuttam
…ks
> Best Regards
>
> On Tue, Jul 28, 2015 at 12:08 AM, Rahul Palamuttam wrote:
>
>> Hi All,
>>
>> I hope this is the right place to post troubleshooting questions.
>> I've been following the install instructions and I get the following error…

Spark Number of Partitions Recommendations

2015-07-28 Thread Rahul Palamuttam
Hi All,

I was wondering why the recommended number for parallelism is 2-3 times the number of cores in your cluster. Is the heuristic explained in any of the Spark papers, or is it more of an agreed-upon rule of thumb?

Thanks,
Rahul P
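The usual rationale for the 2-3x figure is straggler mitigation: with more tasks than cores, a core that finishes early picks up another waiting task instead of idling. A minimal sketch of the arithmetic, where the multiplier is the rule-of-thumb value under discussion rather than anything Spark enforces:

```scala
// Rule-of-thumb sketch: partitions = total cores x a small multiplier,
// so cores that finish their tasks early can pick up remaining work.
def suggestedPartitions(totalCores: Int, multiplier: Int = 3): Int =
  totalCores * multiplier

// e.g. a hypothetical 4-node cluster with 8 cores per node:
val partitions = suggestedPartitions(totalCores = 4 * 8)
assert(partitions == 96)
```

This would typically feed a value like spark.default.parallelism or an explicit numPartitions argument; the right multiplier still depends on task skew and partition size.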

Re: Spark build/sbt assembly

2015-07-27 Thread Rahul Palamuttam
…, Jul 27, 2015 at 11:48 AM, Rahul Palamuttam wrote:
> All nodes are using Java 8.
> I've tried to mimic the environments as much as possible among all nodes.
>
> On Mon, Jul 27, 2015 at 11:44 AM, Ted Yu wrote:
>
>> bq. on one node it works but on the other it gives…

Re: Spark build/sbt assembly

2015-07-27 Thread Rahul Palamuttam
…nts on the two nodes?
> Does the other node use Java 8?
>
> Cheers
>
> On Mon, Jul 27, 2015 at 11:38 AM, Rahul Palamuttam wrote:
>
>> Hi All,
>>
>> I hope this is the right place to post troubleshooting questions.
>> I've been following the in…

Spark build/sbt assembly

2015-07-27 Thread Rahul Palamuttam
Hi All,

I hope this is the right place to post troubleshooting questions. I've been following the install instructions, and I get the following error when running the following from the Spark home directory:

$ ./build/sbt
Using /usr/java/jdk1.8.0_20/ as default JAVA_HOME.
Note, this will be overridden b…
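Since the thread later turns on whether both nodes use the same JDK, one way to rule out environment drift is to pin JAVA_HOME explicitly before invoking the build. A minimal sketch, assuming sbt picks up JAVA_HOME from the environment (the JDK path is the one from the error output, used here only as an example):

```shell
# Pin the JDK explicitly so every node builds with the same Java version.
export JAVA_HOME=/usr/java/jdk1.8.0_20
export PATH="$JAVA_HOME/bin:$PATH"

echo "Building with JAVA_HOME=$JAVA_HOME"
# ./build/sbt assembly   # then run the build from the Spark home directory
```

Running the equivalent of `java -version` on each node before building is a quick way to confirm the versions actually match.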