      Andre Wibisono

YALE UNIVERSITY

andre.wibisono [at] yale [dot] edu

I am an assistant professor in the Department of Computer Science at Yale University. My research interests are in algorithmic methods for machine learning, in particular for problems in optimization, sampling, and game dynamics.

I received BS degrees in Mathematics and in Computer Science from MIT, a Master's in Computer Science from MIT, a Master's in Statistics from UC Berkeley, and a PhD in Computer Science from UC Berkeley. Before joining Yale, I did postdoctoral research at UW-Madison and Georgia Tech.

In Spring 2021, I am teaching CPSC 661: Sampling Algorithms in Machine Learning.

RESEARCH

Last-iterate convergence rates for min-max optimization
Jacob Abernethy, Kevin Lai, and Andre Wibisono
ALT (Algorithmic Learning Theory) 2021
Fast Convergence of Fictitious Play for Diagonal Payoff Matrices
Jacob Abernethy, Kevin Lai, and Andre Wibisono
SODA (Symposium on Discrete Algorithms) 2021
Proximal Langevin Algorithm: Rapid convergence under isoperimetry
Andre Wibisono
arXiv preprint arXiv:1911.01469, 2019
Rapid convergence of the Unadjusted Langevin Algorithm: Isoperimetry suffices
Santosh Vempala and Andre Wibisono
NeurIPS (Neural Information Processing Systems) 2019
arXiv version | poster
Accelerating Rescaled Gradient Descent: Fast optimization of smooth functions
Ashia Wilson, Lester Mackey, and Andre Wibisono
NeurIPS (Neural Information Processing Systems) 2019
Convexity of mutual information along the Ornstein-Uhlenbeck flow
Andre Wibisono and Varun Jog
ISITA (International Symposium on Information Theory and Applications) 2018
Sampling as optimization in the space of measures: The Langevin dynamics as a composite optimization problem
Andre Wibisono
COLT (Conference on Learning Theory) 2018
Convexity of mutual information along the heat flow
Andre Wibisono and Varun Jog
ISIT (International Symposium on Information Theory) 2018
Information and estimation in Fokker-Planck channels
Andre Wibisono, Varun Jog, and Po-Ling Loh
ISIT (International Symposium on Information Theory) 2017
A variational perspective on accelerated methods in optimization
Andre Wibisono, Ashia Wilson, and Michael Jordan
Proceedings of the National Academy of Sciences, 113, E7351--E7358, 2016. [arXiv version]
Optimal rates for zero-order convex optimization: the power of two function evaluations
John Duchi, Michael Jordan, Martin Wainwright, and Andre Wibisono
IEEE Transactions on Information Theory, 61(5): 2788--2806, May 2015
A Hadamard-type lower bound for symmetric diagonally dominant positive matrices
Christopher Hillar and Andre Wibisono
Linear Algebra and its Applications, 472: 135--141, 2015
Convexity of reweighted Kikuchi approximation
Po-Ling Loh and Andre Wibisono
NIPS (Neural Information Processing Systems) 2014
How to hedge an option against an adversary: Black-Scholes pricing is minimax optimal
Jacob Abernethy, Peter Bartlett, Rafael Frongillo, and Andre Wibisono
NIPS (Neural Information Processing Systems) 2013
Streaming variational Bayes
Tamara Broderick, Nicholas Boyd, Andre Wibisono, Ashia Wilson, and Michael Jordan
NIPS (Neural Information Processing Systems) 2013
Maximum entropy distributions on graphs
Christopher Hillar and Andre Wibisono
arXiv preprint arXiv:1301.3321, 2013
Inverses of symmetric, diagonally dominant positive matrices and applications
Christopher Hillar, Shaowei Lin, and Andre Wibisono
arXiv preprint arXiv:1203.6812, 2013
Finite sample convergence rates of zero-order stochastic optimization methods
John Duchi, Michael Jordan, Martin Wainwright, and Andre Wibisono
NIPS (Neural Information Processing Systems) 2012
Minimax option pricing meets Black-Scholes in the limit
Jacob Abernethy, Rafael Frongillo, and Andre Wibisono
STOC (Symposium on Theory of Computing) 2012

THESES

Variational and Dynamical Perspectives on Learning and Optimization
PhD in Computer Science, University of California, Berkeley, May 2016
Maximum Entropy Distributions on Graphs
MA in Statistics, University of California, Berkeley, May 2013
Generalization and Properties of the Neural Response
MEng in Computer Science, Massachusetts Institute of Technology, June 2010