YUAN GAO
Welcome to Yuan's site! I am a PhD student in Operations Research at Columbia University, advised by Prof. Christian Kroer; I also work with Prof. Don Goldfarb. I study optimization models and methods for game theory, market design, and machine learning, and I successfully defended my doctoral dissertation in June 2022. Previously, I completed my undergraduate studies at the National University of Singapore (NUS), where I was fortunate to write an honors thesis advised by Prof. Kim-Chuan Toh and Prof. Melvyn Sim and to receive invaluable advice on optimization from Prof. Defeng Sun. For more information, please refer to my CV.
Email: gao[.]yuan[@]columbia[.]edu (without the square brackets)
Research
  • Nonstationary Dual Averaging and Online Fair Allocation, with Luofeng Liao and Christian Kroer. Submitted.
  • Infinite-Dimensional Fisher Markets and Tractable Fair Division, with Christian Kroer. Accepted at Operations Research. A short version appeared in AAAI 2021.
    • A generalization of the Eisenberg-Gale framework for Fisher market equilibria to a continuum of goods, which leads to a scalable optimization-based method for computing equilibrium/fair allocations (the classical finite-dimensional Eisenberg-Gale program that this generalizes is sketched after this list).
  • Online Market Equilibrium with Application to Fair Division, with Christian Kroer and Alex Peysakhovich. An updated version was accepted at NeurIPS 2021.
    • A distributed, interpretable mechanism for dividing sequentially arriving goods among agents with heterogeneous valuations.
  • Increasing Iterate Averaging for Solving Saddle-Point Problems, with Christian Kroer and Don Goldfarb. AAAI 2021.
    • A simple, highly effective numerical technique for solving zero-sum games and other saddle-point problems, with theoretical guarantees and extensive numerical experiments demonstrating significant speedups (see the averaging sketch after this list).
  • First-Order Methods for Large-Scale Market Equilibrium Computation, with Christian Kroer. NeurIPS 2020.
    • Convex optimization characterizations and efficient first-order methods for computing Fisher market equilibria, with applications to Internet ad auctions, resource allocation, and fair recommender systems (see the proportional response sketch after this list).
  • An Improved Analysis of Stochastic Gradient Descent with Momentum, with Yanli Liu and Wotao Yin. NeurIPS 2020.
    • Analysis of a multi-stage version of SGD with momentum, a widely used heuristic in deep learning training, with experiments demonstrating its advantage (see the multi-stage SGD sketch after this list).
  • Stochastic Flows and Geometric Optimization on the Orthogonal Group, with Krzysztof Choromanski et al., based on a course project. ICML 2020.
  • Accurate Protein Structure Prediction by Embeddings and Deep Learning Representations, with Iddo Drori et al., based on a course project. MLCB 2019.
  • A Homogeneous Interior-Point Method for Conic Programming Involving Exponential Cone Constraints. NUS Honors Thesis.
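A few informal sketches related to the papers above. First, the Fisher-market papers build on the Eisenberg-Gale convex program, whose optimal solutions are market equilibrium allocations; the infinite-dimensional paper extends this framework to a continuum of goods. The finite-dimensional version below, for buyers with budgets B_i, linear valuations v_{ij}, and goods with unit supply, is the standard textbook form rather than the exact formulation used in the papers:

\[
\max_{x \ge 0} \; \sum_i B_i \log\Big(\sum_j v_{ij} x_{ij}\Big)
\quad \text{subject to} \quad \sum_i x_{ij} \le 1 \ \text{ for every good } j,
\]

where the optimal dual variables of the supply constraints give the equilibrium prices.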
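Second, a minimal illustration of the idea behind increasing iterate averaging: instead of averaging the iterates of a base saddle-point solver uniformly, weight later iterates more heavily (here by t**power). The function name, weight schedule, and toy usage below are my own illustrative choices, not the exact schemes analyzed in the paper.

import numpy as np

def increasing_average(iterates, power=1.0):
    """Running average of iterates with weights w_t = t**power.
    power = 0 gives the usual uniform average; power > 0 weights
    later (typically more accurate) iterates more heavily."""
    avg, total_w = np.zeros_like(iterates[0], dtype=float), 0.0
    for t, z in enumerate(iterates, start=1):
        w = float(t) ** power
        total_w += w
        avg += (w / total_w) * (z - avg)   # incremental weighted mean
    return avg

# Toy usage: iterates from a hypothetical solver converging to zero.
iterates = [np.array([1.0, -1.0]) / t for t in range(1, 101)]
print(increasing_average(iterates, power=1.0))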
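Third, one classical first-order method in the market-equilibrium line of work is proportional response dynamics for linear Fisher markets: buyers repeatedly split their budgets across goods in proportion to the value received, and each good's price is the total spending on it. The sketch below uses made-up variable names and toy data, and assumes every buyer values at least one good positively; see the paper for the methods it actually studies and their guarantees.

import numpy as np

def proportional_response(V, B, iters=500):
    """Proportional response dynamics for a linear Fisher market.
    V[i, j]: value of buyer i for good j; B[i]: budget of buyer i.
    Goods have unit supply. Returns (prices, allocation)."""
    m = V.shape[1]
    bids = np.tile((B / m)[:, None], (1, m))      # spread budgets evenly at first
    for _ in range(iters):
        prices = bids.sum(axis=0)                 # p_j = sum_i b_ij
        alloc = bids / prices                     # x_ij = b_ij / p_j
        utils = (V * alloc).sum(axis=1)           # u_i = sum_j v_ij x_ij
        bids = B[:, None] * (V * alloc) / utils[:, None]  # re-split budgets by value received
    prices = bids.sum(axis=0)
    return prices, bids / prices

# Toy market: 3 buyers, 2 goods.
V = np.array([[1.0, 2.0], [2.0, 1.0], [1.0, 1.0]])
B = np.array([1.0, 1.0, 2.0])
prices, alloc = proportional_response(V, B)
print(prices)  # prices sum to the total budget (4 here)
print(alloc)   # each good is fully allocated (columns sum to 1)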
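Finally, a sketch of the multi-stage SGD-with-momentum setup: run heavy-ball SGD in stages and shrink the step size at each stage boundary. The stage lengths, decay factor, and toy objective below are illustrative assumptions, not the schedule analyzed in the paper.

import numpy as np

def multistage_sgd_momentum(grad, w0, stage_lens=(200, 200, 200),
                            lr=0.1, decay=0.1, beta=0.9, seed=0):
    """Heavy-ball SGD run in stages; the step size is multiplied by
    `decay` after each stage. `grad(w, rng)` returns a stochastic gradient."""
    rng = np.random.default_rng(seed)
    w = np.array(w0, dtype=float)
    v = np.zeros_like(w)
    for length in stage_lens:
        for _ in range(length):
            g = grad(w, rng)
            v = beta * v + g        # momentum buffer
            w = w - lr * v          # heavy-ball update
        lr *= decay                 # smaller steps in the next stage
    return w

# Toy problem: noisy gradient of f(w) = 0.5 * ||w||^2, minimized at w = 0.
noisy_grad = lambda w, rng: w + 0.01 * rng.standard_normal(w.shape)
print(multistage_sgd_momentum(noisy_grad, w0=np.ones(3)))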