Hi Nantia,

The example:
> import systemml as sml
> import numpy as np
> sml.setSparkContext(sc)
> m1 = sml.matrix(np.ones((3,3)) + 2)
> m2 = sml.matrix(np.ones((3,3)) + 3)
> m2 = m1 * (m2 + m1)
> m4 = 1.0 - m2
> m4.sum(axis=1).toNumPyArray()

is a Python example (not PyDML) and works only in the pyspark shell, not the 
Scala Spark shell. Just do 'pip install systemml', start the pyspark shell, 
and paste the above code.

A PyDML script does not require any imports, nor does it support Python 
packages such as numpy. Here is an equivalent PyDML script:

m1 = full(1, rows=3, cols=3) + 2
m2 = full(1, rows=3, cols=3) + 3
m2 = m1 * (m2 + m1)
...

> val s4 = ScriptFactory.pydmlFromFile("test.pydml")
> ml.execute(s4)

This, on the other hand, is Scala code and will run in the Scala Spark shell 
(which I believe you invoked).
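
For reference, here is a minimal sketch of the full Scala shell session. The 
import paths and the MLContext constructor are my assumptions about your 
setup, based on the newer SystemML MLContext API:

import org.apache.sysml.api.mlcontext.MLContext
import org.apache.sysml.api.mlcontext.ScriptFactory

val ml = new MLContext(sc)  // sc is the SparkContext the Spark shell provides

// test.pydml must contain only PyDML statements (no Python imports such as numpy)
val s4 = ScriptFactory.pydmlFromFile("test.pydml")
ml.execute(s4)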

It would help if you could send the error message, as I am only guessing at 
your setup.

Thanks,

Niketan 

> On Nov 8, 2016, at 2:49 AM, Nantia Makrynioti <nantiam...@gmail.com> wrote:
> 
> Hello,
> 
> I am trying to run the PyDML script below using the Spark ML Context.
> 
> import systemml as sml
> import numpy as np
> sml.setSparkContext(sc)
> m1 = sml.matrix(np.ones((3,3)) + 2)
> m2 = sml.matrix(np.ones((3,3)) + 3)
> m2 = m1 * (m2 + m1)
> m4 = 1.0 - m2
> m4.sum(axis=1).toNumPyArray()
> 
> 
> I start the Spark shell and create an MLContext successfully. Then I load the
> script from a file using the following command
> 
> val s4 = ScriptFactory.pydmlFromFile("test.pydml")
> 
> Finally, I execute the script using
> 
> ml.execute(s4)
> 
> The imports are not recognized. I suppose that the first import and setting
> the Spark context are not required, since we set up an MLContext after
> starting the Spark shell, but what about numpy? I am a bit confused as to what
> changes I need to make to run this example.
> 
> Thank you in advance for your help,
> Nantia
