UnresolvedAddressException in Kubernetes Cluster

2017-10-09 Thread Suman Somasundar
Hi, I am trying to deploy a Spark app in a Kubernetes cluster. The cluster consists of 2 machines - 1 master and 1 slave, each with the following config: RHEL 7.2, Docker 17.03.1, Kubernetes 1.7. I am following the steps provided in
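
The snippet above is cut off, but java.nio's UnresolvedAddressException generally means some configured host name (API server, driver service, and so on) cannot be resolved from inside the JVM. A minimal Scala check one could run from the driver pod; the host names below are placeholders, not values from the original thread:

    import java.net.InetAddress
    import scala.util.{Failure, Success, Try}

    // Host names to verify from inside the driver or executor pod. These are
    // placeholders -- substitute the API server and driver addresses from your
    // own spark-submit configuration.
    val hostsToCheck = Seq("kubernetes.default.svc", "my-spark-driver-svc")

    hostsToCheck.foreach { host =>
      Try(InetAddress.getByName(host)) match {
        case Success(addr) => println(s"$host -> ${addr.getHostAddress}")
        case Failure(err)  => println(s"$host does not resolve: $err")
      }
    }

Any host that fails to resolve here is a likely source of the exception.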

Log messages for shuffle phase

2016-08-11 Thread Suman Somasundar
Hi, While going through the logs of an application, I noticed that I could not find any logs to dig deeper into any of the shuffle phases. I am interested in finding out the time taken by each shuffle phase and the size of data spilled to disk (if any), among other things. Does anyone know
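
One way to get these numbers without trawling executor logs is a SparkListener that prints shuffle read/write sizes and spill per task. A rough sketch against the Spark 2.x metrics accessors (in 1.x some of them are wrapped in Option, so treat the exact field names as assumptions for older releases):

    import org.apache.spark.SparkContext
    import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

    // Logs shuffle and spill metrics as each task finishes.
    class ShuffleMetricsListener extends SparkListener {
      override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
        val m = taskEnd.taskMetrics
        if (m != null) {
          val read  = m.shuffleReadMetrics.localBytesRead + m.shuffleReadMetrics.remoteBytesRead
          val write = m.shuffleWriteMetrics.bytesWritten
          println(s"stage=${taskEnd.stageId} task=${taskEnd.taskInfo.taskId} " +
            s"shuffleRead=$read shuffleWrite=$write " +
            s"diskSpilled=${m.diskBytesSpilled} memSpilled=${m.memoryBytesSpilled}")
        }
      }
    }

    // Register on the driver, or set spark.extraListeners to the class name.
    def register(sc: SparkContext): Unit = sc.addSparkListener(new ShuffleMetricsListener)

The same figures also show up per stage in the Spark UI and in the event logs written when spark.eventLog.enabled is set.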

Running 2 Spark applications in parallel

2015-10-22 Thread Suman Somasundar
Hi all, Is there a way to run 2 Spark applications in parallel under Yarn in the same cluster? Currently, if I submit 2 applications, one of them waits until the other completes. I want both of them to start and run at the same time. Thanks, Suman.
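
Under YARN this usually comes down to resources and scheduler queues rather than Spark itself: if the first application requests the whole cluster, the second sits waiting until containers free up. A hedged sketch of capping each application so both fit; the numbers and queue name are only examples:

    import org.apache.spark.{SparkConf, SparkContext}

    // Cap this application's footprint so a second app can get containers too.
    // Values are illustrative; size them to your cluster.
    val conf = new SparkConf()
      .setAppName("app-one")
      .set("spark.executor.instances", "4")
      .set("spark.executor.memory", "4g")
      .set("spark.executor.cores", "2")
      .set("spark.yarn.queue", "default")   // or a dedicated queue per application

    val sc = new SparkContext(conf)

The YARN side also matters: the capacity or fair scheduler must be configured to allow more than one running application in the queue.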

Connection closed error while running Terasort

2015-08-31 Thread Suman Somasundar
Hi, I am getting the following error while trying to run a 10 GB TeraSort under Yarn with 8 nodes. The command is: spark-submit --class com.github.ehiggs.spark.terasort.TeraSort --master yarn-cluster --num-executors 10 --executor-memory 32g
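
"Connection closed" during a large shuffle is often a network timeout or a fetch failure under memory pressure. The properties below are standard Spark settings commonly raised in that situation; the values are starting points to tune, not settings from the original thread:

    import org.apache.spark.SparkConf

    // Settings commonly increased when large shuffles drop connections.
    val conf = new SparkConf()
      .set("spark.network.timeout", "600s")       // default 120s
      .set("spark.shuffle.io.maxRetries", "10")   // default 3
      .set("spark.shuffle.io.retryWait", "30s")   // default 5s

Checking the YARN node manager logs for killed containers (e.g. exceeding memory limits) is also worthwhile, since a lost executor produces the same symptom.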

Restricting the number of iterations in MLlib KMeans

2015-05-14 Thread Suman Somasundar
Hi, I want to run a fixed number of iterations in KMeans. There is a command-line argument to set maxIterations, but even if I set it to a number, KMeans runs until the centroids converge. Is there a specific way to specify it on the command line? Also, I wanted to know if we can supply
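
maxIterations is only an upper bound; MLlib's KMeans also stops early once the centroids move less than the convergence tolerance. To force a fixed number of iterations, set the tolerance (epsilon) to 0 as well. A sketch with the RDD-based API, assuming a release in which setEpsilon is public:

    import org.apache.spark.mllib.clustering.KMeans
    import org.apache.spark.mllib.linalg.Vector
    import org.apache.spark.rdd.RDD

    // Force exactly `iterations` passes: maxIterations caps the count, and
    // epsilon = 0 disables the early-exit convergence check.
    def runFixedIterations(data: RDD[Vector], k: Int, iterations: Int) = {
      new KMeans()
        .setK(k)
        .setMaxIterations(iterations)
        .setEpsilon(0.0)   // never declare convergence early
        .run(data)
    }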

How to increase parallelism in Yarn

2014-12-18 Thread Suman Somasundar
Hi, I am using Spark 1.1.1 on Yarn. When I try to run K-Means, I see from the Yarn dashboard that only 3 containers are being used. How do I increase the number of containers used? P.S.: When I run K-Means on Mahout with the same settings, I see that there are 25-30 containers being
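
The container count under YARN comes from the executor request (--num-executors on spark-submit, or the equivalent property in later releases), and the number of tasks that can run concurrently also depends on how many partitions the input has. A hedged sketch with illustrative numbers and a hypothetical input path:

    import org.apache.spark.{SparkConf, SparkContext}

    // Ask YARN for more containers and make sure the data has enough
    // partitions to keep them busy. Numbers are illustrative.
    val conf = new SparkConf()
      .setAppName("kmeans-parallel")
      .set("spark.executor.instances", "25")     // same effect as --num-executors 25
      .set("spark.executor.cores", "2")
      .set("spark.default.parallelism", "200")

    val sc = new SparkContext(conf)
    val data = sc.textFile("hdfs:///path/to/input", minPartitions = 200)  // hypothetical path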

Re: Invalid Class Exception

2014-06-04 Thread Suman Somasundar
build Spark yourself you need to do it with Java 6 — it’s a known issue because of the way Java 6 and 7 package JAR files. But I haven’t seen it result in this particular error. Matei On Jun 3, 2014, at 5:18 PM, Suman Somasundar suman.somasun...@oracle.com wrote: Hi all, I get the following

Re: Invalid Class Exception

2014-06-04 Thread Suman Somasundar
, Suman. On 6/4/2014 10:48 AM, Suman Somasundar wrote: I am building Spark by myself and I am using Java 7 to both build and run. I will try with Java 6. Thanks, Suman. On 6/3/2014 7:18 PM, Matei Zaharia wrote: What Java version do you have, and how did you get Spark (did you build it yourself

Invalid Class Exception

2014-06-03 Thread Suman Somasundar
Hi all, I get the following exception when using Spark to run the example k-means program. I am using Spark 1.0.0 and running the program locally. java.io.InvalidClassException: scala.Tuple2; invalid descriptor for field _1 at

Re: Invalid Class Exception

2014-05-28 Thread Suman Somasundar
On 5/27/2014 1:28 PM, Marcelo Vanzin wrote: On Tue, May 27, 2014 at 1:05 PM, Suman Somasundar suman.somasun...@oracle.com wrote: I am running this on a Solaris machine with logical partitions. All the partitions (workers) access the same Spark folder. Can you check whether you have multiple

Re: Invalid Class Exception

2014-05-27 Thread Suman Somasundar
cluster? If so, one way to fix this is to run the following on the master node: /root/spark-ec2/copy-dir --delete /root/spark This syncs all of Spark across your cluster, configs, jars and everything. 2014-05-23 15:20 GMT-07:00 Suman Somasundar suman.somasun...@oracle.com mailto:suman.somasun

Invalid Class Exception

2014-05-23 Thread Suman Somasundar
Hi, I get the following exception when using Spark to run various programs. java.io.InvalidClassException: org.apache.spark.SerializableWritable; local class incompatible: stream classdesc serialVersionUID = 6301214776158303468, local class serialVersionUID = -7785455416944904980 at
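
A serialVersionUID mismatch on org.apache.spark.SerializableWritable almost always means the driver and the executors are loading jars from different Spark builds. A quick diagnostic sketch, assuming a release that exposes org.apache.spark.SPARK_VERSION:

    import org.apache.spark.{SparkContext, SPARK_VERSION}

    // Compare the Spark build seen by the driver with what each executor loads;
    // any difference points at mixed Spark jars across the cluster.
    def checkVersions(sc: SparkContext): Unit = {
      println(s"driver: spark=$SPARK_VERSION scala=${scala.util.Properties.versionString}")
      val remote = sc.parallelize(1 to sc.defaultParallelism, sc.defaultParallelism)
        .map(_ => (org.apache.spark.SPARK_VERSION, scala.util.Properties.versionString))
        .distinct()
        .collect()
      remote.foreach { case (sparkVer, scalaVer) =>
        println(s"executor: spark=$sparkVer scala=$scalaVer")
      }
    }

If the versions differ, syncing the same Spark build to every node (as suggested later in this thread) resolves the incompatibility.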