On Sat, Apr 2, 2016 at 9:34 PM, Matthias Boehm wrote:
>
>
> thanks Deron for initiating the discussion around the rework of our
> MLContext API (https://issues.apache.org/jira/browse/SYSTEMML-593). Here
> are a couple of thoughts:
>
> (1) Simplicity: Given that the primary

Thanks again for catching https://issues.apache.org/jira/browse/SYSTEMML-609. Yes, the change is now in SystemML head, so please rebuild SystemML or use one of our nightly builds (https://sparktc.ibmcloud.com/repo/latest/). Thanks.
For running SystemML on Spark, you have multiple options (
Just to clarify: the 'scratch' configuration (the remote temporary working directory) is a user-defined setting read from SystemML-config.xml, with an internal default of ./scratch_space if not specified. It is always accessed as a DFS path (which, depending on your Hadoop configuration, might use
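Since the 'scratch' setting is read from SystemML-config.xml, the corresponding entry might look like the sketch below. The element name and value follow the stock config file shipped with SystemML at the time, but please verify them against your own SystemML-config.xml:

```xml
<root>
    <!-- remote temporary working directory; resolved as a DFS path,
         with ./scratch_space assumed as the default when not specified -->
    <scratch>scratch_space</scratch>
</root>
```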