Hi,

In my Spark Streaming job I had the following setting:

            this.jsc.getConf().set("spark.driver.maxResultSize", "0");
and the job failed with the error below:

User class threw exception: Job aborted due to stage failure: Total size of
serialized results of 120 tasks (1082.2 MB) is bigger than
spark.driver.maxResultSize (1024.0 MB)

I realized the default value is 1 GB, so I changed
the configuration as below:

this.jsc.getConf().set("spark.driver.maxResultSize", "2g");

When I reran the job, it gave the same error:

User class threw exception: Job aborted due to stage failure: Total size of
serialized results of 120 tasks (1082.2 MB) is bigger than
spark.driver.maxResultSize (1024.0 MB)

So the change I made is not being picked up by the job. My questions are:

- Is `.set("spark.driver.maxResultSize", "2g")` the right way to change this
setting, or is there another way to do it?
- Is this a bug in Spark 1.3, or has anyone else hit this issue before?
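For reference, here is a minimal sketch of the alternative I am considering: setting the property on the SparkConf before the streaming context is created, rather than on the conf of an already-created context (the class name, app name, and batch interval below are placeholders, not the actual values from my job):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class MaxResultSizeSketch {
    public static void main(String[] args) {
        // Set the limit on the SparkConf *before* the context exists,
        // in case conf changes after context creation are ignored.
        SparkConf conf = new SparkConf()
            .setAppName("my-streaming-job") // placeholder app name
            .set("spark.driver.maxResultSize", "2g");

        // Placeholder 10-second batch interval.
        JavaStreamingContext jsc =
            new JavaStreamingContext(conf, Durations.seconds(10));

        // ... build the streaming job against jsc ...
    }
}
```

An equivalent would be passing `--conf spark.driver.maxResultSize=2g` to spark-submit, if that is the recommended approach.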
