Abstract

Estimating the Lipschitz constant of a function, also known as Lipschitz learning, is a fundamental problem with broad applications in fields such as control and global optimization. In this paper, we study the Lipschitz learning problem with minimal parametric assumptions on the target function. As a first theoretical contribution, we derive novel lower bounds on the sample complexity of this problem for both noise-free and noisy settings under mild assumptions. Moreover, we propose a simple Lipschitz learning algorithm called Lipschitz Constant Estimation by Least Squares Regression (referred to as LCLS). We show that LCLS is asymptotically consistent for general noise assumptions and offers finite sample guarantees that can be translated to new upper bounds on the sample complexity of the Lipschitz learning problem. Our analysis shows that the sample complexity rates derived in this paper are optimal in both the noise-free setting and in the noisy setting when the noise is assumed to follow a Gaussian distribution and that LCLS is a sample-optimal algorithm in both cases. Finally, we show that by design, the LCLS algorithm is computationally faster than existing theoretically consistent methods, and can be readily adapted to various noise assumptions with little to no prior knowledge of the target function properties or noise distribution.
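For context on the problem the abstract describes, below is a minimal illustrative sketch of the classical noise-free baseline for Lipschitz constant estimation: taking the largest empirical difference quotient over all sample pairs. This is not the paper's LCLS method (whose least-squares construction is given in the paper itself); the function names and the test function are ours.

```python
import numpy as np

def naive_lipschitz_estimate(X, y):
    """Estimate the Lipschitz constant of f from samples (x_i, f(x_i))
    as the largest pairwise difference quotient |y_i - y_j| / ||x_i - x_j||.
    Classical noise-free baseline; illustrative only, not the paper's LCLS."""
    X = np.atleast_2d(X)
    n = X.shape[0]
    best = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            dx = np.linalg.norm(X[i] - X[j])
            if dx > 0.0:
                best = max(best, abs(y[i] - y[j]) / dx)
    return best

# Example: f(x) = sin(3x) has Lipschitz constant 3 on the real line.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, size=500)
y = np.sin(3.0 * x)
print(naive_lipschitz_estimate(x.reshape(-1, 1), y))  # approaches 3 from below
```

Note that under observation noise this max-quotient estimator is inflated by noise spikes at nearby sample pairs, which is one intuition for why a regression-based estimator such as LCLS, which averages over samples, can remain consistent under the general noise assumptions the abstract mentions.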

Type

Journal article

Journal

Transactions on Machine Learning Research

Publication Date

01/09/2023

Volume

2023