David Eriksson

deriksson@meta.com

I do research in machine learning with a focus on Bayesian optimization and Gaussian processes. My work has primarily focused on scaling these methods to complex high-dimensional problems.

I am currently a Research Scientist Manager at Meta, where I manage a team of research scientists focusing on AutoML. I was previously a Senior Research Scientist at Uber AI Labs, and before that I received my Ph.D. in Applied Mathematics from Cornell University, where I was advised by Professor David Bindel.

I was one of the main organizers of the NeurIPS 2020 black-box optimization competition, whose goal was to find the best black-box optimizer for machine learning.

News

  • Oct, 2023: I co-organized a Bayesian optimization session with Peter Frazier and Jana Doppa at INFORMS.
  • Sep, 2023: I'm excited to be an Area Chair for AISTATS 2024.
  • Sep, 2023: Our paper "Unexpected improvements to expected improvement" was accepted as a NeurIPS 2023 spotlight.
  • Feb, 2023: Max and I organized a session on Bayesian optimization in the real world at SIAM CSE.
  • Jan, 2023: Three papers accepted to AISTATS 2023!
  • Aug, 2022: I'm excited to be an Area Chair for AISTATS 2023.
  • May, 2022: Our paper Multi-objective Bayesian optimization over high-dimensional search spaces was accepted as an oral at UAI 2022.
  • Feb, 2022: I gave a talk at the AutoML Seminars on our work on high-dimensional Bayesian optimization.
  • July, 2021: I wrote a blog post on our work on High-dimensional Bayesian optimization with sparsity-inducing priors.
  • July, 2021: We wrote a blog post on our work on Latency-Aware Neural Architecture Search with Multi-Objective Bayesian Optimization.
  • June, 2021: Our paper on Latency-Aware Neural Architecture Search with Multi-Objective Bayesian Optimization was accepted to the Workshop on Automated Machine Learning at ICML 2021.
  • May, 2021: Two papers accepted to UAI 2021: (1) High-Dimensional Bayesian Optimization with Sparse Axis-Aligned Subspaces, (2) A Nonmyopic Approach to Cost-Constrained Bayesian Optimization.
  • April, 2021: The paper describing the key learnings from our NeurIPS 2020 black-box optimization competition is now on arXiv.
  • April, 2021: I had a great time attending AISTATS 2021 and presented our poster on Scalable Constrained Bayesian Optimization (SCBO).
  • March, 2021: I attended SIAM CSE 21 and gave two talks. My first talk was about TuRBO in Derivative-Free Optimization Methods for Solving Expensive Global Black-Box Problems (MS130) and my second talk was about SCBO in Bayesian Methods in Science and Engineering (MS217).
  • Jan, 2021: Our paper on Scalable Constrained Bayesian Optimization (SCBO) was accepted to AISTATS 2021.
  • Dec, 2020: Our NeurIPS 2020 black-box optimization competition has concluded. Congratulations to the teams that participated and thank you for the interesting submissions! Final leaderboard at bbochallenge.com.
  • Oct, 2020: Our paper on Fast Matrix Square Roots with Applications to Gaussian Processes and Bayesian Optimization was accepted to NeurIPS 2020.
  • Aug, 2020: I have joined Facebook as a Research Scientist!
  • June, 2020: Our paper on Efficient Rollout Strategies for Bayesian Optimization was accepted to UAI 2020.
  • May, 2020: The proposal for our NeurIPS 2020 black-box optimization competition was accepted!

Publications

See also my Google Scholar page.

Unexpected improvements to expected improvement for Bayesian optimization [Preprint]
Sebastian Ament, Samuel Daulton, David Eriksson, Maximilian Balandat, Eytan Bakshy
Neural Information Processing Systems (NeurIPS 2023)
(Spotlight)

Bayesian Optimization over High-Dimensional Combinatorial Spaces via Dictionary-based Embeddings [Paper]
Aryan Deshwal, Sebastian Ament, Maximilian Balandat, Eytan Bakshy, Janardhan Rao Doppa, David Eriksson
Artificial Intelligence and Statistics (AISTATS 2023)

Discovering Many Diverse Solutions with Bayesian Optimization [Paper]
Natalie Maus, Kaiwen Wu, David Eriksson, Jacob R. Gardner
Artificial Intelligence and Statistics (AISTATS 2023)

Sparse Bayesian optimization [Paper]
Sulin Liu, Qing Feng, David Eriksson, Benjamin Letham, Eytan Bakshy
Artificial Intelligence and Statistics (AISTATS 2023)

Bayesian optimization over discrete and mixed spaces via probabilistic reparameterization [Paper]
Samuel Daulton, Xingchen Wan, David Eriksson, Maximilian Balandat, Michael A. Osborne, Eytan Bakshy
Neural Information Processing Systems (NeurIPS 2022)

Multi-objective Bayesian optimization over high-dimensional search spaces [Paper]
Samuel Daulton, David Eriksson, Maximilian Balandat, Eytan Bakshy
Uncertainty in Artificial Intelligence (UAI 2022)
(Oral, acceptance rate = 16.0%)

High-Dimensional Bayesian Optimization with Sparse Axis-Aligned Subspaces [Paper]
David Eriksson, Martin Jankowiak
Uncertainty in Artificial Intelligence (UAI 2021)

A Nonmyopic Approach to Cost-Constrained Bayesian Optimization [Paper]
Eric Lee, David Eriksson, Valerio Perrone, Matthias Seeger
Uncertainty in Artificial Intelligence (UAI 2021)

Bayesian Optimization is Superior to Random Search for Machine Learning Hyperparameter Tuning: Analysis of the Black-Box Optimization Challenge 2020 [Paper]
Ryan Turner, David Eriksson, Michael McCourt, Juha Kiili, Eero Laaksonen, Zhen Xu, Isabelle Guyon
Post Proceedings of the Competitions & Demonstrations Track @ NeurIPS 2020

Scalable Constrained Bayesian Optimization [Paper]
David Eriksson, Matthias Poloczek
Artificial Intelligence and Statistics (AISTATS 2021)

Fast Matrix Square Roots with Applications to Gaussian Processes and Bayesian Optimization [Paper] [Code]
Geoff Pleiss, Martin Jankowiak, David Eriksson, Anil Damle, Jacob R. Gardner
Neural Information Processing Systems (NeurIPS 2020)

Efficient Rollout Strategies for Bayesian Optimization [Paper] [Code]
Eric Hans Lee, David Eriksson, Bolong Cheng, Michael McCourt, David Bindel
Uncertainty in Artificial Intelligence (UAI 2020)

Scalable Global Optimization via Local Bayesian Optimization [Paper] [Code]
David Eriksson, Michael Pearce, Jacob R. Gardner, Ryan Turner, Matthias Poloczek
Neural Information Processing Systems (NeurIPS 2019)
(Spotlight, acceptance rate = 3.0%)

Scaling Gaussian Process Regression with Derivatives [Paper]
David Eriksson, Kun Dong, Eric Lee, David Bindel, Andrew G. Wilson
Neural Information Processing Systems (NeurIPS 2018)

Scalable log determinants for Gaussian process kernel learning [Paper]
Kun Dong, David Eriksson, Hannes Nickisch, David Bindel, Andrew G. Wilson
Neural Information Processing Systems (NeurIPS 2017)

Fast exact shortest distance queries for massive point clouds [Paper]
David Eriksson, Evan Shellshear
Graphical Models (2016)

Tropospheric delay ray tracing applied in VLBI analysis [Paper]
David Eriksson, Daniel S. MacMillan, John M. Gipson
Journal of Geophysical Research: Solid Earth (2014)

Continental hydrology loading observed by VLBI measurements [Paper]
David Eriksson, Daniel S. MacMillan
Journal of Geodesy (2014)

Mountains

My goal is to climb all 58 14ers (summits over 14,000 ft) in Colorado. I've currently summited 38/58.

  • Mt Whitney (14,505 ft), Inyo/Sequoia
  • Half Dome (8,839 ft), Yosemite
  • High Sierra Trail (3-day run)
  • Crater Lake
  • Mt Sneffels (14,157 ft), San Juan Mountains
  • Angels Landing (5,790 ft), Zion
  • South Sister (10,363 ft), Three Sisters Wilderness
  • Mt Williamson (14,379 ft), Inyo
  • Capitol Peak (14,130 ft), Elk Mountains
  • Longs Peak (14,259 ft), Rocky Mountains
  • Clouds Rest (9,931 ft), Yosemite
  • Telescope Peak (11,043 ft), Death Valley
  • Mt Shasta (14,179 ft), Shasta–Trinity
  • Wheeler Peak (13,064 ft), Great Basin
  • Grand Canyon