I believe you want to set memoryFraction higher, not lower. These two
older threads seem to describe issues similar to the one you are experiencing:
https://mail-archives.apache.org/mod_mbox/spark-user/201503.mbox/%3CCAHUQ+_ZqaWFs_MJ=+V49bD2paKvjLErPKMEW5duLO1jAo4=d...@mail.gmail.com%3E
Hi Dennis,
On Wed, Jun 15, 2016 at 11:39 PM, Dennis Lovely wrote:
> You could try tuning spark.shuffle.memoryFraction and
> spark.storage.memoryFraction (both of which have been deprecated in 1.6),
> but ultimately you need to find out where you are bottlenecked and address
>
On Thu, Jun 16, 2016 at 5:27 AM, Deepak Goel wrote:
> What is the hardware configuration you are running Spark on?
>
> It is 24 cores, 128GB RAM
Hi,
>
> What do you see under Executors and Details for Stage (for the
> affected stages)? Anything weird memory-related?
>
Under the Executors tab, the logs show this warning:
16/06/16 20:45:40 INFO TorrentBroadcast: Reading broadcast variable 422145 took 1 ms
16/06/16 20:45:40 WARN MemoryStore:
Hi,
What do you see under Executors and Details for Stage (for the
affected stages)? Anything weird memory-related?
What does your "I am reading data from Kafka into Spark and writing it
into Cassandra after processing it" pipeline look like?
Regards,
Jacek Laskowski
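For context, a streaming job of the shape described is typically submitted with the Kafka and Cassandra connector packages on the classpath. A minimal sketch, assuming Spark 1.5.x; the class name, master URL, host, jar name, and package versions below are illustrative placeholders, not details taken from this thread:

```shell
# Sketch of a spark-submit invocation for a Kafka -> Spark -> Cassandra job.
# All names and version coordinates are illustrative assumptions.
spark-submit \
  --class com.example.KafkaToCassandraJob \
  --master spark://master:7077 \
  --packages org.apache.spark:spark-streaming-kafka_2.10:1.5.1,datastax:spark-cassandra-connector:1.5.0-s_2.10 \
  --conf spark.cassandra.connection.host=cassandra-host \
  my-streaming-job.jar
```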
You could try tuning spark.shuffle.memoryFraction and
spark.storage.memoryFraction (both of which have been deprecated in 1.6),
but ultimately you need to find out where you are bottlenecked and address
that, as adjusting memoryFraction will only be a stopgap. Both shuffle and
storage
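For reference, the pre-1.6 fractions mentioned above are ordinary Spark configuration properties and can be set at submit time. The values here are illustrative starting points (and the jar name is a placeholder), not recommendations from this thread:

```shell
# Raise the shuffle fraction and lower the storage fraction; the Spark 1.5.x
# defaults are 0.2 and 0.6 respectively. Both properties are deprecated in
# 1.6, where spark.memory.fraction governs a unified memory region instead.
spark-submit \
  --conf spark.shuffle.memoryFraction=0.4 \
  --conf spark.storage.memoryFraction=0.4 \
  my-streaming-job.jar
```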
Hi,
I did set --driver-memory 4G. I still run into this issue after 1 hour of
data load.
I also tried version 1.6 in a test environment. I hit this issue much faster
than in the 1.5.1 setup.
LCassa
On Tue, Jun 14, 2016 at 3:57 PM, Gaurav Bhatnagar
wrote:
> try setting the
Hi,
Upgrading Spark is not an option right now. I did set --driver-memory 4G,
and I still run into this issue after 1 hour of data load.
LCassa
On Tue, Jun 14, 2016 at 3:57 PM, Gaurav Bhatnagar
wrote:
> try setting the option --driver-memory 4G
>
> On Tue, Jun 14, 2016 at 3:52
try setting the option --driver-memory 4G
On Tue, Jun 14, 2016 at 3:52 PM, Ben Slater
wrote:
> A high level shot in the dark but in our testing we found Spark 1.6 a lot
> more reliable in low memory situations (presumably due to
>
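The suggestion above corresponds to the standard spark-submit memory flags. A minimal sketch; the sizes are illustrative and the jar name is a placeholder:

```shell
# Give the driver 4 GB of heap; executor memory is raised independently
# with its own flag.
spark-submit \
  --driver-memory 4G \
  --executor-memory 8G \
  my-streaming-job.jar
```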
A high-level shot in the dark, but in our testing we found Spark 1.6 a lot
more reliable in low-memory situations (presumably due to
https://issues.apache.org/jira/browse/SPARK-1). If it's an option, it's
probably worth a try.
Cheers
Ben
On Wed, 15 Jun 2016 at 08:48 Cassa L
Hi,
I would appreciate any clue on this. It has become a bottleneck for our
Spark job.
On Mon, Jun 13, 2016 at 2:56 PM, Cassa L wrote:
> Hi,
>
> I'm using Spark version 1.5.1. I am reading data from Kafka into Spark and
> writing it into Cassandra after processing it. Spark