Hi Xiangrui,

For orthogonality properties in the factors, we need a constraint solver
beyond the usual ones (L1, upper and lower bounds, L2, etc.).

The constraint solver interface is standard, and I can add it to mllib
optimization.

But I am not sure how I will call the GPL-licensed IPM solver from
mllib. Assume the solver interface is as follows:

QpSolver(h: DenseMatrix[Double], f: Array[Double], linearEquality: Int,
linearInequality: Int, lb: Boolean, ub: Boolean)

I then have functions to update the equalities, inequalities, bounds,
etc., followed by a run that generates the solution.
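
In Scala, the interface could look roughly like this (just a sketch using
breeze types; the names and stub bodies are illustrative, not the final API):

import breeze.linalg.{DenseMatrix, DenseVector}

// Sketch of the QP solver interface described above: minimize
// 0.5 * x' H x + f' x subject to the configured equalities, inequalities,
// and bounds. Names and stub bodies are illustrative only.
class QpSolver(
    h: DenseMatrix[Double],  // dense Hessian H
    f: Array[Double],        // linear term f
    linearEquality: Int,     // number of linear equality constraints
    linearInequality: Int,   // number of linear inequality constraints
    lb: Boolean,             // lower bounds active?
    ub: Boolean) {           // upper bounds active?

  // Update the equality system Aeq x = beq before the solve.
  def updateEquality(aeq: DenseMatrix[Double], beq: Array[Double]): this.type = this

  // Update the inequality system A x <= b before the solve.
  def updateInequality(a: DenseMatrix[Double], b: Array[Double]): this.type = this

  // Update the lower/upper bound vectors before the solve.
  def updateBounds(lower: Array[Double], upper: Array[Double]): this.type = this

  // Run the interior point method and return the solution x
  // (placeholder body; the real run() calls into the native IPM solver).
  def run(): DenseVector[Double] = DenseVector.zeros[Double](f.length)
}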

For L1 constraints I have to use the epigraph formulation, which needs a
variable transformation before the solve.
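
For reference, the transformation I mean is the standard epigraph split
(a sketch, not the exact formulation used in the code):

\min_x \; \tfrac{1}{2} x^\top H x + f^\top x + \lambda \lVert x \rVert_1
\quad\Longrightarrow\quad
\min_{x,\,t} \; \tfrac{1}{2} x^\top H x + f^\top x + \lambda \mathbf{1}^\top t
\quad \text{s.t.} \quad -t \le x \le t

so the solver only ever sees linear inequality constraints, at the cost of
doubling the variable vector to (x, t).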

I was thinking that for problems that do not need constraints, people
will use ALS.scala, while ConstrainedALS.scala will have the constrained
formulations.

I can point you to the code once it is ready, and then you can guide me
on how to refactor it into the mllib ALS?

Thanks.
Deb

Hi Deb,

Why do you want to make those methods public if you only need to
replace the solver for the subproblems? You could instead make the
solver pluggable. Right now it supports least squares and non-negative
least squares. If the only information you need is Y^T Y and Y^T b, you
can define an interface for the subproblem solvers and maintain the IPM
solver in your own code base.
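
Something like this, for example (just a sketch; the trait and method
names here are made up, not the current ALS code):

// Hypothetical pluggable interface for the ALS subproblem solvers.
// ata is the Gram matrix Y^T Y (e.g. packed as a flat array) and atb is
// Y^T b; the result is the factor vector of length rank.
trait SubproblemSolver extends Serializable {
  def solve(ata: Array[Double], atb: Array[Double], rank: Int): Array[Double]
}

The existing least squares and NNLS code would become two implementations
inside MLlib, and your IPM/QP solver could be a third implementation
maintained in your own repo.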

Btw, just curious, what is the use case for quadratic constraints?

Best,
Xiangrui

On Thu, Jun 5, 2014 at 3:38 PM, Debasish Das <debasish.da...@gmail.com>
wrote:
> Hi,
>
> We are adding a constrained ALS solver in Spark to solve matrix
> factorization use-cases that need additional constraints (bounds,
> equality, inequality, quadratic constraints).
>
> We are using a native version of a primal-dual SOCP solver due to its
> small memory footprint and the sparse CCS matrix computation it uses. The
> solver depends on the AMD and LDL packages from Timothy Davis for sparse
> CCS matrix algebra (released under LGPL).
>
> Due to the GPL dependencies, it won't be possible to release the code
> under the Apache license for now. If we get good results on our use-cases,
> we plan to write a version in breeze or modify joptimizer for sparse CCS
> operations.
>
> I derived ConstrainedALS from Spark mllib ALS and I am comparing its
> performance with default ALS and non-negative ALS as baselines. The plan
> is to release the code under the GPL license for community review. I have
> kept the package structure as org.apache.spark.mllib.recommendation.
>
> There are some private functions defined in ALS which I would like to
> reuse. Is it possible to remove the private modifier from the following
> functions:
>
> 1. makeLinkRDDs
> 2. makeInLinkBlock
> 3. makeOutLinkBlock
> 4. randomFactor
> 5. unblockFactors
>
> I don't want to copy any code. I can submit a PR to make these
> changes.
>
> Thanks.
> Deb
