Hi,

I am trying to use GLM with Spark.

I went through the GLM documentation at
http://apache.github.io/incubator-systemml/algorithms-regression.html#generalized-linear-models
and it appears that inputs like X and Y have to be supplied as files, and
those files have to be in HDFS.

Is this understanding correct? Can't X and Y be supplied as DataFrames
from a Spark context, as in the LinearRegression example in
http://apache.github.io/incubator-systemml/mlcontext-programming-guide.html#train-using-systemml-linear-regression-algorithm
?
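
To make the question concrete, here is roughly what I am hoping would
work, sketched against the newer MLContext API shown in the programming
guide (xDf/yDf, the "$dfam" parameter, and the "beta_out" output name
are my placeholders and not verified against GLM.dml):

import org.apache.sysml.api.mlcontext.MLContext
import org.apache.sysml.api.mlcontext.ScriptFactory.dmlFromFile

// sc is the SparkContext from spark-shell; xDf and yDf stand in for
// DataFrames of features and labels already present in the application.
val ml = new MLContext(sc)

// Bind the DataFrames directly to the X and Y variables of GLM.dml,
// the same way the guide binds inputs for the LinearRegression example.
val glm = dmlFromFile("scripts/algorithms/GLM.dml")
  .in("X", xDf)
  .in("Y", yDf)
  .in("$dfam", 2)        // e.g. binomial family (assumed parameter name)
  .out("beta_out")       // assumed name of the coefficients variable

val results = ml.execute(glm)
val betas = results.getMatrix("beta_out")

If GLM.dml can be driven this way, that would avoid having to write the
data out to HDFS first.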

Regards,
Sourav
