Have a look at creating a scheduler allocation file with fair scheduling, for example (pool names here are placeholders):

<?xml version="1.0"?>
<allocations>
  <pool name="pool1">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
  </pool>
  <pool name="pool2">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
  </pool>
</allocations>
Set the following:
val settingsMap = Map(
  "spark.scheduler.allocation.file" -> schedulerAllocationFile,
  "spark.scheduler.mode" -> "FAIR")
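One way to apply these settings (a sketch; the file path is an assumption) is via spark-defaults.conf rather than programmatically:

```properties
# Enable the fair scheduler and point it at the allocation file above
spark.scheduler.mode              FAIR
spark.scheduler.allocation.file   /path/to/fairscheduler.xml
```

Jobs can then be assigned to a pool with sc.setLocalProperty("spark.scheduler.pool", "pool1"), where "pool1" must match a pool name declared in the allocation file.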
Hi Jacek,
To run a spark master on my windows box, I've created a .bat file with contents
something like:
.\bin\spark-class.cmd org.apache.spark.deploy.master.Master --host
For the worker:
.\bin\spark-class.cmd org.apache.spark.deploy.worker.Worker spark://:7077
To wrap these in services,
Hi LCassa,
Try:
Map to pair, then reduce by key.
The Spark documentation is a pretty good reference for this, and there are plenty of word count examples on the internet.
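The "map to pair, then reduce by key" shape can be sketched with plain Scala collections (Spark's RDD `map` / `reduceByKey` follow the same pattern; the input string here is made up):

```scala
// Hypothetical input; in Spark this would come from sc.textFile(...)
val words = "the quick brown fox jumps over the lazy dog the".split("\\s+")

// Map to pair: each word becomes (word, 1)
val pairs = words.map(w => (w, 1))

// Reduce by key: sum the counts per word (reduceByKey(_ + _) in Spark)
val counts = pairs.groupBy(_._1).map { case (w, ps) => (w, ps.map(_._2).sum) }

counts.toSeq.sortBy(-_._2).foreach(println)
```
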
Warm regards,
TimB
From: Cassa L [mailto:lcas...@gmail.com]
Sent: Thursday, 19 November 2015 11:27 AM
To: user
Subject: how
If you are running a local context, could it be that you should use:

<scope>provided</scope>

?
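For context, "provided" here refers to the Maven dependency scope; a sketch of what such a dependency entry might look like (artifact and version below are assumptions, pick the ones matching your cluster):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <!-- version is an assumption; match it to your cluster -->
  <version>1.5.2</version>
  <!-- "provided": the jar is supplied by the Spark runtime, not bundled -->
  <scope>provided</scope>
</dependency>
```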
Thanks,
Tim
From: swetha kasireddy [mailto:swethakasire...@gmail.com]
Sent: Wednesday, 18 November 2015 2:01 PM
To: Tathagata Das
Cc: user
Subject: Re: Streaming Job gives error after changing to version
Hi,
I have a Spark Kafka streaming application that works when I run with a local
Spark context, but not with a remote one.
My code consists of:
1. A spring-boot application that creates the context
2. A shaded jar file containing all of my spark code
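A minimal sketch of what component 1's context creation might look like against a remote master (the master URL and jar path below are made up; shipping the shaded jar to executors via setJars is one common difference when moving off a local context):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical values; the real master URL and jar path are site-specific
val conf = new SparkConf()
  .setAppName("streaming-app")
  .setMaster("spark://master-host:7077")
  // Ship the shaded jar containing the Spark code to the remote executors
  .setJars(Seq("target/my-spark-jobs-shaded.jar"))
val sc = new SparkContext(conf)
```
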
On my PC (Windows), I have a spar