2021

  1. Geometry-Aware Gradient Algorithms for Neural Architecture Search. Li, L., Khodak, M., Balcan, M.F., and Talwalkar, A. In International Conference on Learning Representations, 2021.
  2. On Data Efficiency of Meta-learning. Al-Shedivat, M., Li, L., Xing, E., and Talwalkar, A. In International Conference on Artificial Intelligence and Statistics, 2021.

2020

  1. Weight-sharing Beyond NAS: Efficient Feature Map Selection and Federated Hyperparameter Tuning. Khodak, M., Li, L., Balcan, M.F., and Talwalkar, A. In On-device Intelligence Workshop at MLSys, 2020.
  2. A System for Massively Parallel Hyperparameter Tuning. Li, L., Jamieson, K., Rostamizadeh, A., Gonina, E., Ben-tzur, J., Hardt, M., Recht, B., and Talwalkar, A. In Conference on Machine Learning Systems, 2020.

2019

  1. Random Search and Reproducibility for Neural Architecture Search. Li, L., and Talwalkar, A. In Conference on Uncertainty in Artificial Intelligence, 2019.

2018

  1. Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization. Li, L., Jamieson, K., DeSalvo, G., Rostamizadeh, A., and Talwalkar, A. Journal of Machine Learning Research, 2018.
  2. Exploiting Reuse in Pipeline-Aware Hyperparameter Tuning. Li, L., Sparks, E., Jamieson, K., and Talwalkar, A. In Workshop on Systems for ML at NeurIPS, 2018.

2017

  1. Hyperband: Bandit-Based Configuration Evaluation for Hyperparameter Optimization. Li, L., Jamieson, K., DeSalvo, G., Rostamizadeh, A., and Talwalkar, A. In International Conference on Learning Representations, 2017.