Hi,

I am using the spark-sql shell. While launching it I run:

spark-sql --conf spark.driver.maxResultSize=20g

I also tried spark-sql --conf "spark.driver.maxResultSize"="20g", but still no luck. Do I need to use a set command, something like:

spark-sql --conf set "spark.driver.maxResultSize"="20g"?
Hi Karthik,

You must set the value before the SparkContext (sc) is created. Also, don't assign an excessive value like 20g to maxResultSize; you can set it to 2g at most, as per your error message.
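Something like this at launch should do it (a rough sketch; I am assuming the plain spark-sql CLI here, and the 2g value is just an example to adjust):

spark-sql --conf spark.driver.maxResultSize=2g

There is no separate "set" keyword on the command line; --conf key=value is the whole syntax, and passing it at launch applies the value before the context exists. A SET statement inside a running session, as far as I know, will not change this particular setting, because the SparkContext has already been created by then.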
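If you are using Java 1.8, a commonly suggested yarn-site.xml addition is relaxing the NodeManager virtual memory check, since Java 8's larger virtual memory footprint tends to trip it. This is my assumption about which section is meant, so adjust it to your setup:

<!-- assumption: disable the virtual memory check that Java 8 often trips -->
<property>
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
</property>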
Hi Kali,

In the shuffle stage the maximum is 2 GB (2048 MB); from your error, it needs more memory than that. Can you let me know your cluster config details?
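If it helps with collecting those details, the running session's settings can be dumped from the spark-sql prompt (if I remember the syntax right, SET -v lists every property with its current value):

spark-sql> SET;     -- properties explicitly set for this session
spark-sql> SET -v;  -- all properties and their current values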
Thanks & Regards
Kishore M
> On 01-Jun-2016, at 9:11 PM, "kali.tumm...@gmail.com"
> wrote:
>
> Hi All ,
>
> I am getting spark driver memo
Hi All,

I am getting a Spark driver memory issue even after overriding the conf with --conf spark.driver.maxResultSize=20g. I also set it in my SQL script (set spark.driver.maxResultSize=16;), but the same error still happens:
Job aborted due to stage failure: Total size of serialized