Hi SystemML folks,

Are there any recommended Spark configurations when running SystemML on a
single machine? That is, is there a difference between launching Spark with
master=local[*] (so SystemML runs as a standard process inside a single
driver JVM) and launching a single-node Spark cluster? If the latter is
preferred, is there a recommended balance between driver and executor memory?
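For concreteness, here is roughly what I mean by the two setups (the jar and
script names are just placeholders, and the memory values are arbitrary):

  # local mode: one JVM holds the driver and all worker threads
  spark-submit --master local[*] --driver-memory 16g SystemML.jar -f script.dml

versus a standalone cluster running on the same machine:

  # single-node standalone cluster: separate driver and executor JVMs
  spark-submit --master spark://localhost:7077 \
    --driver-memory 8g --executor-memory 8g \
    SystemML.jar -f script.dml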

Thanks,

Anthony
