The TFOCS package is announced here: https://databricks.com/blog/2015/11/02/announcing-the-spark-tfocs-optimization-package.html
-------------------------------------------------------------------------------
Robin East
Spark GraphX in Action
Michael Malak and Robin East
Manning Publications Co.
http://www.manning.com/books/spark-graphx-in-action

> On 5 Oct 2016, at 08:29, Robin East <robin.e...@xense.co.uk> wrote:
>
> I would say no, at least not without a fair degree of algorithm-writing
> experience. MLlib is primarily a set of machine learning algorithms, many of
> which are built on implementations of distributed optimisation procedures.
> The SAS routines you mention are general-purpose optimisation routines that
> have no directly comparable implementations in MLlib. One possibility is the
> TFOCS Spark package, which might be of interest, but I'm not sure it quite
> matches what you are asking for. It's built by Databricks, so perhaps one of
> their people can advise.
>
> One question I would ask: is this really a big data problem? Could you use a
> Python or Julia library on a machine with a large amount of RAM?
>
> I would be interested in hearing the views of others on the forum.
>
> Sent from my iPhone
>
>> On 5 Oct 2016, at 05:28, nsareen <nsar...@gmail.com> wrote:
>>
>> I'm not getting any support in this group. Is the question not valid? I need
>> someone to reply to this: we have a huge dependency on SAS which we want to
>> eliminate, and want to know if Spark can help.
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/MLib-Non-Linear-Optimization-tp27645p27835.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
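As a concrete illustration of the single-machine suggestion in the thread: SciPy's `optimize` module covers much of the unconstrained nonlinear optimisation that SAS's NLP procedures provide, without needing a cluster. A minimal sketch follows; the Rosenbrock test function and the BFGS solver are illustrative choices on my part, not anything specified by the original poster.

```python
# Minimal single-machine nonlinear optimisation with SciPy,
# as one possible alternative to SAS NLP routines.
from scipy.optimize import minimize, rosen, rosen_der

# Starting point for the classic Rosenbrock test problem
# (its global minimum is at [1, 1, ..., 1]).
x0 = [1.3, 0.7, 0.8, 1.9, 1.2]

# BFGS is a standard quasi-Newton method for smooth unconstrained
# problems; jac= supplies the analytic gradient.
result = minimize(rosen, x0, method="BFGS", jac=rosen_der)

print(result.success)  # whether the solver reports convergence
print(result.x)        # the solution found, near all ones
```

For constrained problems, `minimize` also accepts `bounds=` and `constraints=` arguments with methods such as SLSQP, which is closer to what the SAS nonlinear programming procedures do.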