David Eriksson

deriksson@fb.com

I do research in machine learning with a focus on Bayesian optimization and Gaussian processes. My work has primarily focused on scaling these methods to complex high-dimensional problems.

I am currently a Sr. Research Scientist at Facebook. I was previously a Sr. Research Scientist at Uber AI Labs, and before that I received my Ph.D. in Applied Mathematics from Cornell University, where I was advised by Professor David Bindel.

I was a main organizer of the NeurIPS 2020 black-box optimization competition, which aimed to find the best black-box optimizer for machine learning.

News

  • July, 2021: I wrote a blog post on our work on High-dimensional Bayesian optimization with sparsity-inducing priors.
  • July, 2021: We wrote a blog post on our work on Latency-Aware Neural Architecture Search with Multi-Objective Bayesian Optimization.
  • June, 2021: Our paper on Latency-Aware Neural Architecture Search with Multi-Objective Bayesian Optimization was accepted to the Workshop on Automated Machine Learning at ICML 2021.
  • May, 2021: Two papers accepted to UAI 2021: (1) High-Dimensional Bayesian Optimization with Sparse Axis-Aligned Subspaces, (2) A Nonmyopic Approach to Cost-Constrained Bayesian Optimization.
  • April, 2021: The paper describing the key findings from our NeurIPS 2020 black-box optimization competition is now on arXiv.
  • April, 2021: I had a great time attending AISTATS 2021 and presented our poster on Scalable Constrained Bayesian Optimization (SCBO).
  • March, 2021: I attended SIAM CSE 21 and gave two talks. My first talk was about TuRBO in Derivative-Free Optimization Methods for Solving Expensive Global Black-Box Problems (MS130) and my second talk was about SCBO in Bayesian Methods in Science and Engineering (MS217).
  • Jan, 2021: Our paper on Scalable Constrained Bayesian Optimization (SCBO) was accepted to AISTATS 2021.
  • Dec, 2020: Our NeurIPS 2020 black-box optimization competition has concluded. Congratulations to the teams that participated and thank you for the interesting submissions! Final leaderboard at bbochallenge.com.
  • Oct, 2020: Our paper on Fast Matrix Square Roots with Applications to Gaussian Processes and Bayesian Optimization was accepted to NeurIPS 2020.
  • Aug, 2020: I have joined Facebook as a Research Scientist!
  • June, 2020: Our paper on Efficient Rollout Strategies for Bayesian Optimization was accepted to UAI 2020.
  • May, 2020: The proposal for our NeurIPS 2020 black-box optimization competition was accepted!
Publications

    See also my Google Scholar page.

    High-Dimensional Bayesian Optimization with Sparse Axis-Aligned Subspaces [Preprint]
    David Eriksson, Martin Jankowiak
    Uncertainty in Artificial Intelligence (UAI 2021), To appear

    A Nonmyopic Approach to Cost-Constrained Bayesian Optimization [Preprint]
    Eric Lee, David Eriksson, Valerio Perrone, Matthias Seeger
    Uncertainty in Artificial Intelligence (UAI 2021), To appear

    Bayesian Optimization is Superior to Random Search for Machine Learning Hyperparameter Tuning: Analysis of the Black-Box Optimization Challenge 2020 [Preprint]
    Ryan Turner, David Eriksson, Michael McCourt, Juha Kiili, Eero Laaksonen, Zhen Xu, Isabelle Guyon
    Post Proceedings of the Competitions & Demonstrations Track @ NeurIPS 2020, To appear

    Scalable Constrained Bayesian Optimization [Paper]
    David Eriksson, Matthias Poloczek
    Artificial Intelligence and Statistics (AISTATS 2021)

    Fast Matrix Square Roots with Applications to Gaussian Processes and Bayesian Optimization [Paper] [Code]
    Geoff Pleiss, Martin Jankowiak, David Eriksson, Anil Damle, Jacob R. Gardner
    Neural Information Processing Systems (NeurIPS 2020)

    Efficient Rollout Strategies for Bayesian Optimization [Paper] [Code]
    Eric Hans Lee, David Eriksson, Bolong Cheng, Michael McCourt, David Bindel
    Uncertainty in Artificial Intelligence (UAI 2020)

    Scalable Global Optimization via Local Bayesian Optimization [Paper] [Code]
    David Eriksson, Michael Pearce, Jacob R. Gardner, Ryan Turner, Matthias Poloczek
    Neural Information Processing Systems (NeurIPS 2019)
    (Spotlight, acceptance rate = 3.0%)

    Scaling Gaussian Process Regression with Derivatives [Paper]
    David Eriksson, Kun Dong, Eric Lee, David Bindel, Andrew G. Wilson
    Neural Information Processing Systems (NeurIPS 2018)

    Scalable Log Determinants for Gaussian Process Kernel Learning [Paper]
    Kun Dong, David Eriksson, Hannes Nickisch, David Bindel, Andrew G. Wilson
    Neural Information Processing Systems (NeurIPS 2017)

    Fast exact shortest distance queries for massive point clouds [Paper]
    David Eriksson, Evan Shellshear
    Graphical Models (2016)

    Tropospheric delay ray tracing applied in VLBI analysis [Paper]
    David Eriksson, Daniel S. MacMillan, John M. Gipson
    Journal of Geophysical Research: Solid Earth (2014)

    Continental hydrology loading observed by VLBI measurements [Paper]
    David Eriksson, Daniel S. MacMillan
    Journal of Geodesy (2014)

    Mountains

    Mt Whitney (14,505 ft), Inyo/Sequoia
    Half Dome (8,839 ft), Yosemite
    Crater Lake
    Mt Williamson (14,379 ft), Inyo
    Angels Landing (5,790 ft), Zion
    South Sister (10,363 ft), Three Sisters Wilderness
    Longs Peak (14,259 ft), Rocky Mountains
    Clouds Rest (9,931 ft), Yosemite
    Telescope Peak (11,043 ft), Death Valley
    Mt Shasta (14,179 ft), Shasta–Trinity
    Wheeler Peak (13,064 ft), Great Basin
    Grand Canyon