Hello, I have been running some performance tests on an AWS EMR cluster (1 master, 2 slaves, instance type m3.xlarge).
In case you want to run some performance workloads on the 1.0.0 release, please follow the steps below:

- Set the SYSTEMML_HOME and SPARK_HOME environment variables on your cluster.
- Download and extract the release (systemml-1.0.0-bin.tgz).
- Copy the following extra files into your systemml-1.0.0-bin folders:
      /home/hadoop/systemml/conf/*
      /home/hadoop/systemml/bin/*
      /home/hadoop/systemml/scripts/perftest/extractTestData.dml
- Using the jars in the lib folder gave me an error (not sure why), so I had to manually build the latest version and copy SystemML.jar to the systemml-1.0.0-bin/target folder:
      /home/hadoop/systemml/target/*
- Make sure that your temp folders are empty:
      rm -r /home/hadoop/systemml-1.0.0-bin/temp_perftest/
      hdfs dfs -rmr /user/hadoop/*
- And finally:
      ./run_perftest.py
  (hybrid-spark) will run the whole test suite for all algorithms on 10MB data.
      ./run_perftest.py --config-dir /home/hadoop/systemml-1.0.0-bin/temp --temp-dir /home/hadoop/systemml-1.0.0-bin/temp --file-system-type local --exec-type singlenode
  (singlenode)

Regards,
Krishna
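For the first step, the two environment variables can be set once in the Hadoop user's shell profile. A minimal sketch follows; the exact paths are assumptions for a default EMR layout (Spark at /usr/lib/spark, the extracted release under /home/hadoop), so adjust them to your cluster:

```shell
# Assumed install locations on an EMR node; adjust to your cluster layout.
export SYSTEMML_HOME=/home/hadoop/systemml-1.0.0-bin
export SPARK_HOME=/usr/lib/spark

# Convenient, but optional: put the SystemML launcher scripts on the PATH.
export PATH="$SYSTEMML_HOME/bin:$PATH"
```

Appending these lines to ~/.bashrc (and re-sourcing it) makes the variables available to every subsequent run_perftest.py invocation.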
