
Seminario: Searching for Optimal Per-Coordinate Step-sizes with Multidimensional Backtracking

October 2, 2025


The Instituto de Ingeniería Matemática y Computacional (IMC) invites you to a seminar to be held this Wednesday, October 8.

Victor Sanches Portella, Institute of Mathematics and Statistics, University of São Paulo (IME-USP)

Wednesday, October 8, 2025, 13:40 hrs. (In person at the Edificio San Agustín auditorium. Zoom link available by writing to imc@uc.cl)

ABSTRACT

Backtracking line-search is an effective technique to automatically tune the step-size in smooth optimization, guaranteeing performance similar to what is achieved with the theoretically optimal step-size. Many approaches have been developed to instead tune per-coordinate step-sizes, also known as diagonal preconditioners, but none of the existing methods are provably competitive with the optimal per-coordinate step-sizes.
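As background for the scalar case the abstract starts from, a minimal sketch of standard backtracking line-search with the Armijo sufficient-decrease condition is shown below; the quadratic test problem, constants, and function names are illustrative choices, not part of the talk.

```python
import numpy as np

def backtracking_line_search(f, grad, x, alpha0=1.0, beta=0.5, c=1e-4):
    """Armijo backtracking: shrink the step-size geometrically until
    the sufficient-decrease condition
        f(x - alpha * g) <= f(x) - c * alpha * ||g||^2
    holds for g = grad f(x)."""
    g = grad(x)
    alpha = alpha0
    while f(x - alpha * g) > f(x) - c * alpha * g.dot(g):
        alpha *= beta  # backtrack: try a smaller step
    return alpha

# Illustrative poorly scaled quadratic f(x) = 0.5 * x^T diag(D) x
D = np.array([1.0, 100.0])
f = lambda x: 0.5 * np.dot(D * x, x)
grad = lambda x: D * x

x = np.array([1.0, 1.0])
alpha = backtracking_line_search(f, grad, x)
# The accepted step decreases f, at a scale roughly set by the largest
# curvature (here 100), which is exactly what a single scalar step-size
# is forced to respect on every coordinate.
```

The poor scaling of `D` illustrates why a single step-size is limiting: the scalar `alpha` must shrink to match the stiffest coordinate, which motivates the per-coordinate step-sizes discussed next.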

In this talk, I will first give an introduction to this problem and discuss many "adaptive" methods, particularly those well-known in machine learning, that construct preconditioners during the optimization process. I will then present multidimensional backtracking, an extension of backtracking line-search that finds good diagonal preconditioners for smooth convex problems. Our key insight is that hypergradients (gradients with respect to the step-sizes) yield separating hyperplanes that allow us to search for good preconditioners using cutting-plane methods. As black-box cutting-plane approaches like the ellipsoid method are computationally prohibitive, we develop an efficient algorithm tailored to our setting. Multidimensional backtracking is provably competitive with the best diagonal preconditioner and requires no manual tuning.
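To make the hypergradient idea concrete, the sketch below computes, by the chain rule, the gradient of the objective after one preconditioned step with respect to the per-coordinate step-sizes. This is only an illustration of the general notion the abstract mentions, on a hypothetical quadratic; it is not the authors' algorithm, whose cutting-plane machinery is the subject of the talk.

```python
import numpy as np

def hypergradient(grad, x, p):
    """Gradient of p |-> f(x - p * grad f(x)) with respect to the
    per-coordinate step-sizes p. By the chain rule,
        h_i = -g_i * g_new_i,
    where g = grad f(x) and g_new = grad f at the updated point."""
    g = grad(x)
    x_new = x - p * g          # one step with diagonal preconditioner diag(p)
    g_new = grad(x_new)
    return -g * g_new

# Illustrative quadratic f(x) = 0.5 * x^T diag(D) x (hypothetical example)
D = np.array([1.0, 100.0])
grad = lambda x: D * x

x = np.array([1.0, 1.0])
p = np.array([0.1, 0.1])       # candidate per-coordinate step-sizes
h = hypergradient(grad, x, p)
# h[0] < 0: increasing the first coordinate's step-size would help.
# h[1] > 0: the second step-size overshot and should shrink.
```

The hyperplane through `p` with normal `h` separates the current candidate from preconditioners that would have made more progress, which is what lets cutting-plane methods narrow down the search, as described in the abstract.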

BIO

Victor is a postdoctoral researcher at the Institute of Mathematics, Statistics and Computer Science of the University of São Paulo (IME - USP), Brazil, supervised by Prof. Yoshiharu Kohayakawa. His research focuses on online learning, randomized algorithms, and differential privacy, with broader interests in theoretical computer science, optimization, and learning theory. He completed his PhD in Computer Science at UBC in 2024 under the supervision of Prof. Nick Harvey, and earned his MSc in Computer Science from USP in 2019, advised by Prof. Marcel K. de Carli Silva.

