Dear Dr. Pasechnik and Dr. Forets,

I am Yash Patel, an M.Sc. student in Computer Science at the University of Bonn; my primary areas of focus are approximation algorithms, combinatorial optimization, and complexity theory. I am currently teaching myself sum-of-squares (SOS) optimization from the course material available online at <http://www.sumofsquares.org/public/index.html>, as I would very much like to write my master's thesis on this topic. I am fascinated by its ability to capture a modern algorithmic viewpoint that is amenable to computation and deeply rooted in semidefinite programming.
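To make that last point concrete, here is a rough, untested sketch (assuming Sage's current SemidefiniteProgram interface with its default CVXOPT backend; the polynomial and the matrices A0, A1 are only a toy example) of how an SOS certificate reduces to a small SDP: matching the coefficients of p(x) = x^4 + 2*x^3 + 3*x^2 + 2*x + 1 against a Gram matrix in the monomial basis (1, x, x^2) leaves one free entry t, and p is SOS exactly when the one-parameter linear matrix inequality below is feasible.

    # Toy sketch (untested): is p(x) = x^4 + 2*x^3 + 3*x^2 + 2*x + 1 a sum of squares?
    # Coefficient matching against z^T Q z with z = (1, x, x^2) gives Q(t) = A0 + t*A1.
    A0 = matrix([[1., 1., 0.], [1., 3., 1.], [0., 1., 1.]])
    A1 = matrix([[0., 0., 1.], [0., -2., 0.], [1., 0., 0.]])

    sdp = SemidefiniteProgram(maximization=True)   # CVXOPT is the default backend
    t = sdp.new_variable()
    sdp.set_objective(t[0])          # any objective will do; we only need feasibility
    # A0 + t*A1 >= 0 in the Loewner order, written in Sage's "a*x <= b" constraint form:
    sdp.add_constraint((-A1) * t[0] <= A0)
    sdp.solve()                      # should return t close to 1, where Q(1) is the
                                     # all-ones matrix and p(x) = (1 + x + x^2)^2

Automating exactly this kind of construction (choosing the monomial basis, generating the coefficient-matching constraints, and extracting the squares from the Gram matrix) for general multivariate polynomials is what I would like to contribute, as described below.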
I would like to extend SageMath's semidefinite programming (SDP) module as my GSoC project. Polynomial optimization is NP-hard in general, and several researchers have proposed approximation procedures based on semidefinite relaxations. Such relaxations can be constructed by representing non-negative polynomials as SOS polynomials and, dually, via the theory of moments (moment matrices); toy sketches of both constructions appear above and in the postscript. Implementing these representations would therefore be a central deliverable for the SDP module.

A second direction would be to interface more SDP solvers (such as SDPA, SCS, Ncpol2sdpa, SDPT3, MOSEK, Gurobi, Xpress, and SCIP-SDP) with SageMath; this would make the module more robust and let users pick a solver according to their specific needs. Although many SDP solvers use preconditioning to improve numerical stability, problems with large input coefficients can remain ill-conditioned even after preconditioning, which makes it difficult to obtain even a good suboptimal solution with off-the-shelf solvers. For example, Roux et al. <https://hal.archives-ouvertes.fr/hal-01358703/document> show that SDPT3 is more robust to ill-conditioning than SeDuMi on their example. Hence, one deliverable could be to implement the prototype of de Klerk et al. <https://arxiv.org/pdf/1507.03549.pdf>, which uses a short-step primal interior point method (IPM) and incorporates Diophantine approximation at each iteration to bound the bit-sizes of the intermediate iterates; this methodology could potentially enhance the performance of the solver. It would also be interesting to see how the solver performs with more practical variants of the IPM. For example, one could implement the inner-iteration preconditioned Krylov subspace approach to IPMs proposed by Cui et al. <https://arxiv.org/pdf/1604.07491.pdf> to mitigate ill-conditioning. Their code is currently tuned for LP only, but it would be worth investigating how to extend the method to quadratic programming and SDP.

I look forward to hearing from either of you, and to any guidance on what I should be doing besides getting familiar with the codebase (especially the SDP module) and git.

Sincerely,
Yash

*The best theory is inspired by practice. The best practice is inspired by theory. -Donald Knuth*
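P.S. For the dual, moment-matrix side of the picture, here is a second rough, untested sketch in the same spirit (again assuming the SemidefiniteProgram/CVXOPT interface; the polynomial and the names E0, ..., E4, y are only for illustration). It minimizes the univariate polynomial x^4 - 3*x^2 via its order-2 moment relaxation, where y[k] plays the role of the moment E[x^k] and the moment matrix must be positive semidefinite; for univariate polynomials this relaxation is exact.

    # Toy sketch (untested): order-2 moment relaxation of  min_x  x^4 - 3*x^2.
    # The moment matrix is M(y) = [[1, y1, y2], [y1, y2, y3], [y2, y3, y4]] >= 0,
    # written as E0 + y1*E1 + y2*E2 + y3*E3 + y4*E4 with the elementary matrices below.
    E0 = matrix([[1., 0., 0.], [0., 0., 0.], [0., 0., 0.]])
    E1 = matrix([[0., 1., 0.], [1., 0., 0.], [0., 0., 0.]])
    E2 = matrix([[0., 0., 1.], [0., 1., 0.], [1., 0., 0.]])
    E3 = matrix([[0., 0., 0.], [0., 0., 1.], [0., 1., 0.]])
    E4 = matrix([[0., 0., 0.], [0., 0., 0.], [0., 0., 1.]])

    sdp = SemidefiniteProgram(maximization=False)
    y = sdp.new_variable()
    sdp.set_objective(y[4] - 3*y[2])   # the polynomial's coefficients paired with the moments
    # M(y) >= 0 in the Loewner order, in Sage's "sum_i a_i*x_i <= b" constraint form:
    sdp.add_constraint((-E1)*y[1] + (-E2)*y[2] + (-E3)*y[3] + (-E4)*y[4] <= E0)
    sdp.solve()                        # should be close to -9/4, the true global minimum

A full moment-relaxation layer would of course have to build these matrices, handle constraints, and extract minimizers automatically; the sketch is only meant to show that the existing SDP interface is already a workable starting point.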
