Hello,

I am trying to run the PyDML script below using the Spark ML Context.

import systemml as sml
import numpy as np
sml.setSparkContext(sc)
m1 = sml.matrix(np.ones((3,3)) + 2)
m2 = sml.matrix(np.ones((3,3)) + 3)
m2 = m1 * (m2 + m1)
m4 = 1.0 - m2
m4.sum(axis=1).toNumPyArray()
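For reference, this is the result I expect the script to produce, checked with plain NumPy (no SystemML involved), so any working version should match it:

```python
import numpy as np

# Same computation as the script above, but entirely in NumPy.
m1 = np.ones((3, 3)) + 2    # all 3s
m2 = np.ones((3, 3)) + 3    # all 4s
m2 = m1 * (m2 + m1)         # elementwise: 3 * (4 + 3) = 21
m4 = 1.0 - m2               # all -20
result = m4.sum(axis=1)     # row sums: [-60., -60., -60.]
```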


I start the Spark Shell and create an MLContext successfully. Then I load the
script from a file using the following command:

val s4 = ScriptFactory.pydmlFromFile("test.pydml")

Finally, I execute the script using:

 ml.execute(s4)

The imports are not recognized. I suppose the first import and the call that
sets the Spark context are not required, since we already set up an MLContext
after starting the Spark Shell, but what about numpy? I am a bit confused as
to what changes I need to make to run this example.

Thank you in advance for your help,
Nantia
