Hi, can you share the exception?
You need to give the value right after --driver-memory. First
preference goes to the config key-value pairs passed to spark-submit, and
only then to spark-defaults.conf.
You can refer to the docs for the exact property name.
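For example, a minimal sketch (the class name, jar name, and the 4g value are placeholders; adjust for your application):

```shell
# The value must immediately follow the flag when submitting:
spark-submit --driver-memory 4g --class com.example.Main app.jar

# Or set the equivalent property in conf/spark-defaults.conf
# (flags passed to spark-submit override this file):
# spark.driver.memory   4g
```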
Thanks,
Prem
On Tue, Jun 19, 2018 at 5:47 PM,
I have a Spark cluster containing 3 nodes, and my application is a jar file
run with java -jar .
How can I set driver memory for my application?
spark-defaults.conf is only read by ./spark-submit.
"java --driver-memory -jar " fails with an exception.