Optimal algorithms for differentially private stochastic monotone variational inequalities and saddle-point problems

Boob, Digvijay; Guzman, Cristobal

Abstract

In this work, we conduct the first systematic study of stochastic variational inequality (SVI) and stochastic saddle point (SSP) problems under the constraint of differential privacy (DP). We propose two algorithms: Noisy Stochastic Extragradient (NSEG) and Noisy Inexact Stochastic Proximal Point (NISPP). We show that a stochastic approximation variant of these algorithms attains risk bounds vanishing as a function of the dataset size, with respect to the strong gap function, and that a sampling-with-replacement variant achieves optimal risk bounds with respect to a weak gap function. We also show lower bounds of the same order on the weak gap function; hence, our algorithms are optimal. Key to our analysis is the investigation of algorithmic stability bounds for both algorithms, which are new even in the nonprivate case. The running time of the sampling-with-replacement algorithms, as a function of the dataset size n, is O(n^2) for NSEG and O(n^{3/2}) for NISPP.
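To make the abstract concrete, below is a minimal, hypothetical sketch of a noisy extragradient loop in the spirit of NSEG: each stochastic operator evaluation is perturbed with Gaussian noise (as a DP-style Gaussian mechanism would do), followed by the standard two-step extragradient update. The step size `eta`, noise scale `sigma`, and averaging scheme here are illustrative choices, not the calibrated parameters from the paper.

```python
import numpy as np

def noisy_extragradient(F, z0, eta, sigma, steps, proj=lambda z: z, seed=None):
    """Illustrative noisy extragradient loop for a monotone operator F.

    Each evaluation of F is perturbed with Gaussian noise of scale sigma
    (a stand-in for a DP Gaussian mechanism); the parameters here are
    hypothetical, not the paper's calibrated choices.
    """
    rng = np.random.default_rng(seed)
    z = np.asarray(z0, dtype=float)
    avg = np.zeros_like(z)
    for _ in range(steps):
        # Extrapolation step with a noisy operator evaluation.
        g = F(z) + sigma * rng.standard_normal(z.shape)
        z_half = proj(z - eta * g)
        # Update step, re-evaluating the (noisy) operator at the midpoint.
        g_half = F(z_half) + sigma * rng.standard_normal(z.shape)
        z = proj(z - eta * g_half)
        avg += z_half
    # Averaged iterate: the standard output for monotone VIs.
    return avg / steps

# Toy monotone VI: the bilinear saddle point min_x max_y x*y,
# whose operator F(x, y) = (y, -x) is monotone with solution (0, 0).
F = lambda z: np.array([z[1], -z[0]])
z_star = noisy_extragradient(F, [1.0, 1.0], eta=0.1, sigma=0.01,
                             steps=2000, seed=0)
```

On this toy problem the averaged iterate approaches the solution (0, 0); the risk bounds in the abstract quantify how such accuracy trades off against the dataset size under the DP constraint.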

More information

Title according to WOS: Optimal algorithms for differentially private stochastic monotone variational inequalities and saddle-point problems
Title according to SCOPUS: not found in local SCOPUS DB (SCOPUS_ID: 85153081205)
Journal title: MATHEMATICAL PROGRAMMING
Publisher: SPRINGER HEIDELBERG
Publication date: 2023
DOI: 10.1007/s10107-023-01953-5

Notes: ISI, SCOPUS