To: Phadnis, Varun
Cc: user@spark.apache.org
Subject: Re: Spark driver CPU usage
Use the conf spark.task.cpus to control the number of CPUs allocated to each task.
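As a minimal sketch of how that setting could be applied (the value 2 is purely illustrative, not a recommendation from this thread):

```
# spark-defaults.conf — reserve 2 CPU cores per task (illustrative value)
spark.task.cpus  2
```

The same property can be passed at launch time, e.g. `spark-submit --conf spark.task.cpus=2 ...`, or set programmatically on the SparkConf before the context is created.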
On Mar 1, 2017, at 5:41 PM, Phadnis, Varun wrote:
>
> Hello,
>
> Is there a way to control CPU usage for the driver when running applications in
> client mode?
>
> Currently we are observing that the driver occupies all the cores. Launching
> just 3 instances of the WordCount sample application's driver concurrently on the
> same machine brings the usage of its 4 cores
Cool! Thanks for your inputs Jacek and Mark!
From: Mark Hamstra [mailto:m...@clearstorydata.com]
Sent: 13 January 2017 12:59
To: Phadnis, Varun
Cc: user@spark.apache.org
Subject: Re: Spark and Kafka integration
See "API compatibility" in http://spark.apache.org/versioning-policy.h
Hello,
We are using Spark 2.0 with Kafka 0.10.
As I understand it, much of the API packaged in the following dependency we are
targeting is marked as "@Experimental":
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
  <version>2.0.0</version>
</dependency>
What are the implications of this being marked as experimental? A
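For context, the experimental surface being discussed is the direct-stream API in spark-streaming-kafka-0-10. A minimal sketch of its use, following the Spark 2.0 integration guide (broker address, group id, and topic name below are placeholders, not values from this thread):

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

object KafkaSketch {
  def main(args: Array[String]): Unit = {
    // Standard Kafka consumer properties; all values here are illustrative.
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer"   -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id"           -> "example-group",
      "auto.offset.reset"  -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val conf = new SparkConf().setAppName("KafkaSketch")
    val ssc = new StreamingContext(conf, Seconds(5))

    // KafkaUtils.createDirectStream is part of the @Experimental 0-10 API
    // referenced in the question above.
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](Seq("example-topic"), kafkaParams)
    )

    stream.map(record => (record.key, record.value)).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Per the versioning policy cited in the reply, "@Experimental" means the API may change between minor releases, so code like the above may need updates when upgrading Spark.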