First-order Optimization Methods for Large-scale Problems

LPSC 125
Mon, 01/08/2018 - 4:00pm

Hoi-To Wai
Postdoctoral Fellow, Arizona State University

Abstract:
Large-scale problems, where the problem dimension is high or the amount of
data involved is huge, are often encountered in today's machine learning and
information processing applications. This talk presents two recent
first-order methods for large-scale problems under different settings. 

In the first part, I will introduce a decentralized Frank-Wolfe (DeFW)
algorithm that runs on a network of machines in a master-less setting. The
algorithm is suitable for scattered-data scenarios, where data are stored at
agents/computers that cooperate to solve a common problem. The DeFW
algorithm is "projection-free": it handles high-dimensional constraints
without computing projections, resulting in lower runtime complexity than
state-of-the-art projection-based methods. In terms of convergence rate, we
show that the proposed method converges at least sublinearly for convex
problems and for non-convex problems (to a stationary point), with a rate
that depends on the connectivity of the underlying network. As a side topic,
I will also discuss a simple online Frank-Wolfe algorithm for stochastic
optimization.
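
To make the projection-free idea concrete, here is a minimal sketch of the
classical (centralized) Frank-Wolfe iteration over an l1-norm ball. It is
illustrative only: it omits the decentralized gossip averaging that DeFW adds
on top, and all function and parameter names below are my own, not taken
from the paper.

import numpy as np

def frank_wolfe_l1(grad_f, x0, radius=1.0, n_iters=100):
    # Classical Frank-Wolfe over the l1-ball {x : ||x||_1 <= radius}.
    # Sketch only; DeFW additionally averages iterates across networked
    # agents, which is not shown here.
    x = x0.copy()
    for t in range(n_iters):
        g = grad_f(x)
        # Linear minimization oracle: argmin over ||s||_1 <= radius of
        # <g, s> is a signed vertex of the l1-ball, so no projection
        # onto the constraint set is ever computed.
        i = np.argmax(np.abs(g))
        s = np.zeros_like(x)
        s[i] = -radius * np.sign(g[i])
        gamma = 2.0 / (t + 2)            # standard diminishing step size
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

The key point is that the per-iteration cost is dominated by one gradient
evaluation and one coordinate search, whereas projection onto a
high-dimensional constraint set can be far more expensive.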

In the second part, I will present a curvature-aided incremental aggregated
gradient (CIAG) method for finite-sum optimization. This algorithm is
suitable for the "big-data" setting, where the objective function consists of
$m$ component functions with $m \gg 1$. Such problems are tackled using an
incremental optimization scheme, i.e., considering one component function at
a time. Here, the novelty is to improve gradient-tracking performance with
the aid of curvature (Hessian) information. We show that for strongly convex
problems, the CIAG method converges linearly at a rate comparable to that of
the full gradient method, even though each iteration touches only one
component function, and it is $m$ times faster than the IAG method, the
deterministic counterpart of the popular SAG method.
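
As a rough sketch of the curvature-aided gradient-tracking idea: each
component function keeps the gradient and Hessian from its last visit, and
the aggregated gradient is corrected by a first-order Taylor term at the
current iterate. The code below is an assumption-laden illustration of this
idea, not the paper's exact pseudocode; all names and the step-size choice
are mine.

import numpy as np

def ciag_sketch(grads, hessians, x0, step=1e-2, n_epochs=10):
    # Illustrative curvature-aided incremental aggregated gradient loop
    # (an assumed reading of the CIAG idea, not the paper's pseudocode).
    # grads[i](x) and hessians[i](x) return the gradient and Hessian of
    # the i-th component function f_i at x.
    m = len(grads)
    x = x0.copy()
    H_store = [hessians[i](x) for i in range(m)]          # Hessian snapshots
    c = [grads[i](x) - H_store[i] @ x for i in range(m)]  # g_i - H_i x_i
    H_sum = sum(H_store)
    b = sum(c)
    for _ in range(n_epochs):
        for i in range(m):  # incremental: one component per iteration
            # Taylor-corrected aggregate, sum_i [g_i + H_i (x - x_i)],
            # evaluated cheaply from the running sums b and H_sum.
            g_approx = b + H_sum @ x
            x = x - (step / m) * g_approx
            # Refresh component i's snapshot at the new iterate.
            g_new, H_new = grads[i](x), hessians[i](x)
            H_sum = H_sum + (H_new - H_store[i])
            b = b + (g_new - H_new @ x) - c[i]
            H_store[i], c[i] = H_new, g_new - H_new @ x
    return x

The running sums let each iteration use curvature-corrected information from
all $m$ components while recomputing the gradient and Hessian of only one.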

The efficacy of both methods is supported by numerical experiments on
synthetic and real data. This is joint work with Jean Lafond (Telecom
ParisTech), Eric Moulines (Ecole Polytechnique), Wei Shi, Angelia Nedich,
and Anna Scaglione (ASU).

Bio:

Read more:
http://eecs.oregonstate.edu/colloquium/first-order-optimization-methods-large-scale-problems