RE: Spark driver CPU usage

2017-03-01 Thread Phadnis, Varun
2017 06:08
To: Phadnis, Varun <phad...@sky.optymyze.com>
Cc: user@spark.apache.org
Subject: Re: Spark driver CPU usage

Use the conf spark.task.cpus to control the number of CPUs to use in a task.

On Mar 1, 2017, at 5:41 PM, Phadnis, Varun <phad...@sky.optymyze.com> wrote:
> Hello,
> I
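For reference, spark.task.cpus is a standard Spark configuration property (number of cores to allocate per task; default 1). A minimal sketch of setting it, where the value 2 and the class/jar names are placeholder examples:

```
# In spark-defaults.conf (cores allocated per task; 2 is an arbitrary example):
spark.task.cpus  2

# Or equivalently at submit time (class and jar names are placeholders):
spark-submit --conf spark.task.cpus=2 --class com.example.WordCount app.jar
```

Note that this caps cores per *task* on the executors; it does not by itself limit the driver process, which is the question raised below.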

Spark driver CPU usage

2017-03-01 Thread Phadnis, Varun
Hello, Is there a way to control CPU usage for the driver when running applications in client mode? Currently we are observing that the driver occupies all the cores. Launching just 3 driver instances of the WordCount sample application concurrently on the same machine brings the usage of its 4
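Spark does not expose a driver-CPU cap for client mode (spark.driver.cores applies only in cluster deploy mode), so one common workaround is to pin the driver process at the OS level. A sketch using Linux taskset; the master URL, class, and jar names are placeholders:

```
# Restrict the driver JVM (and all of its threads) to cores 0 and 1.
# Placeholders: spark://host:7077, com.example.WordCount, app.jar
taskset -c 0,1 spark-submit \
  --master spark://host:7077 \
  --deploy-mode client \
  --class com.example.WordCount app.jar
```

cgroups (or a container CPU quota) achieve the same effect with finer control, at the cost of more setup.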

RE: Spark and Kafka integration

2017-01-12 Thread Phadnis, Varun
Cool! Thanks for your inputs Jacek and Mark!

From: Mark Hamstra [mailto:m...@clearstorydata.com]
Sent: 13 January 2017 12:59
To: Phadnis, Varun <phad...@sky.optymyze.com>
Cc: user@spark.apache.org
Subject: Re: Spark and Kafka integration

See "API compatibility" in http://

Spark and Kafka integration

2017-01-12 Thread Phadnis, Varun
Hello, We are using Spark 2.0 with Kafka 0.10. As I understand it, much of the API packaged in the following dependency we are targeting is marked as "@Experimental":

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
  <version>2.0.0</version>
</dependency>

What are the implications of this being marked as experimental?
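For anyone reproducing this setup without editing their build file, the same artifact can also be resolved at submit time via its Maven coordinates; a sketch, with the application jar name as a placeholder:

```
# --packages downloads the connector (and its transitive dependencies)
# from Maven Central and puts it on the driver and executor classpaths.
spark-submit \
  --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.0.0 \
  app.jar
```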