Hey Ryan,

Sorry for the ambiguity. What I did was basically make a shortlist of the 
major past topics related to LRSDP on GitHub. I didn't realise that sparse 
and dense constraints are already implemented, which is great.


By "detail constraints" I basically want to say is that it would be helpful to 
have some dataset that is guaranteed to not have any saddle point, i.e. having 
only local and global minima so that we can check the convergence of the LRSDP 
on problems not involving saddle point. Maybe it shouldn't be implemented as  a 
constraint but just as something to keep in mind during the generation of the 
data.


Since sparse and dense constraints have already been implemented for the 
MVU SDP, I agree that variadic templates wouldn't be of much help in this 
case. I haven't given much thought to this over the past few days, as I was 
caught up with coursework and a hackathon, but I will look further into the 
code base this week and give a detailed suggestion on whether it is 
possible to further constrain the problem or pass additional useful 
information to the SDP class (a rough sketch of how I currently read the 
interface is below).
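
As a starting point for that, this is roughly how I read the existing 
interface after a first pass over core/optimizers/sdp -- a minimal sketch, 
assuming the SDP constructor and the SparseA()/DenseA()/SparseB()/DenseB() 
accessors behave the way the headers suggest (I still need to double-check 
the exact signatures):

    #include <iostream>
    #include <mlpack/core.hpp>
    #include <mlpack/core/optimizers/sdp/sdp.hpp>

    using namespace arma;
    using namespace mlpack::optimization;

    int main()
    {
      const size_t n = 5;  // dimension of the PSD variable X = U U^T

      // One sparse and one dense equality constraint Tr(A_i X) = b_i.
      SDP<sp_mat> sdp(n, 1 /* sparse */, 1 /* dense */);

      sdp.C() = speye(n, n);                        // objective matrix C
      sdp.SparseA()[0] = speye(n, n);               // sparse constraint Tr(X) = 1
      sdp.SparseB()[0] = 1.0;
      sdp.DenseA()[0] = symmatu(randu<mat>(n, n));  // dense constraint matrix
      sdp.DenseB()[0] = 0.5;

      // Evaluating Tr(A_i U U^T) never needs the full n x n matrix U U^T:
      // Tr(A U U^T) = Tr(U^T A U), which is O(n^2 r) for an n x r factor U.
      const mat U = randu<mat>(n, 2);
      const mat AsU = sdp.SparseA()[0] * U;   // sparse * dense -> dense
      const mat AdU = sdp.DenseA()[0] * U;
      const double sparseVal = trace(U.t() * AsU);
      const double denseVal = trace(U.t() * AdU);
      std::cout << sparseVal << " " << denseVal << std::endl;

      return 0;
    }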


Best Regards,

Daniel Li

________________________________
From: Ryan Curtin <r...@ratml.org>
Sent: 02 April 2018 16:18:17
To: LI Xuran
Cc: mlpack@lists.mlpack.org
Subject: Re: [mlpack] MVU Bug Fix GSOC

On Mon, Mar 26, 2018 at 08:23:20PM +0000, LI Xuran wrote:
> >I took a quick look at your proposal and I think it is relatively clear
> >and sufficiently detailed.  I am not clear on exactly what you mean by
> >"5.  it might also be useful to write an algorithm to pre-process the
> >dataset to make it smoother and convex"---note that the LRSDP algorithm
> >breaks the convexity of SDPs and it is a nonconvex optimization.
>
> Thanks a lot for the feedback! Yeah, you are right that LRSDP is a
> nonconvex optimization; what I intended to say is that during sample
> generation, it might be helpful if we can place some constraints (say
> convex and closed? I am not too sure about the detail constraints yet,
> but there is a theorem in the paper on local minima specifying them)
> on the data to guarantee that an optimal solution can always be
> reached on the generated sample, both by LRSDP and by a dual solver.
>
> I've gone through the rest of the papers and the posts about LRSDP on
> GitHub over the past few days, as I am interested in what efforts have
> been made to debug it. Do you think the following ideas would be
> helpful for the debugging project:
>
>  1.  refactor the SDP class to allow detail constraints specified in input.
>  2.  create a variadic template to specify linear/sparse/dense
>  constraints on the input or A (the constraint matrix) and support
>  evaluation of Tr(A_i * UU^T)
>
>
> I didn't add these two to my proposal, but if you think implementing
> them would be useful then I can also look into it and take it on as
> part of the project. Maybe I can try to approach it during the
> community bonding period.

Hi Daniel,

I realize my response is past the deadline here, so the proposal can't
be modified.  Still, I did want to ask for a little clarification of
what you mean, to be sure I understand correctly.

For the refactoring you are proposing, can you tell me a bit more about
what you mean by "allow detail constraints specified in input"?

I think the MVU SDP can be expressed using only sparse and dense
constraints, which we already have implemented.  So I'm not sure what
advantage using variadic templates would get us; can you clarify that
please?

Thanks,

Ryan

--
Ryan Curtin    | "Happy premise #2: There is no giant foot trying
r...@ratml.org | to squash me." - Kit Ramsey