Abstract
Differentially private stochastic gradient descent (DP-SGD), which perturbs the clipped gradients, is a popular approach for private machine learning. The Gaussian mechanism (GM), combined with the moments accountant (MA), has demonstrated a much better privacy-utility tradeoff than the advanced composition theorem. However, it is unclear whether the tradeoff can be further improved by other mechanisms with different noise distributions. To this end, we extend GM (p = 2) to the generalized p-power exponential mechanism family (pEM with p > 0) and establish its privacy guarantee. The privacy-utility tradeoff of GM can then be enhanced by searching for a noise distribution in this wider mechanism space. To implement pEM in practice, we design an effective sampling method and extend MA to pEM to tightly estimate the privacy loss. In addition, we formally prove the non-optimality of GM using the variational method. Numerical experiments validate the properties of pEM and provide a comprehensive comparison between pEM and two other state-of-the-art methods. Experimental results show that pEM is preferred when the noise variance is small relative to the signal and the dimension is not too high.
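The pEM noise family described above has density proportional to exp(-|x/σ|^p), which reduces to Gaussian noise at p = 2. A minimal sketch of drawing such noise via a standard gamma transform is shown below; this is an illustrative assumption, not the paper's actual sampling method, and the function names and the clipping helper are hypothetical.

```python
import numpy as np

def sample_p_exponential(p, sigma, size, rng=None):
    """Draw samples with density proportional to exp(-|x / sigma|**p).

    Uses the fact that if Y ~ Gamma(1/p, 1), then Y**(1/p) with a random
    sign has the p-power exponential distribution (Gaussian at p = 2).
    """
    rng = np.random.default_rng() if rng is None else rng
    magnitude = rng.gamma(shape=1.0 / p, scale=1.0, size=size) ** (1.0 / p)
    sign = rng.choice([-1.0, 1.0], size=size)
    return sigma * sign * magnitude

def perturb_clipped_gradient(grad, clip_norm, p, sigma, rng=None):
    """DP-SGD-style step (hypothetical helper): clip, then add pEM noise."""
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / norm)
    return clipped + sample_p_exponential(p, sigma, grad.shape, rng)
```

At p = 2 the sampler recovers a centered normal with variance σ²/2 (density ∝ exp(-(x/σ)²)); smaller p gives heavier tails, larger p lighter ones.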
| Original language | English |
|---|---|
| Pages (from-to) | 155018-155034 |
| Number of pages | 17 |
| Journal | IEEE Access |
| Volume | 9 |
| DOIs | |
| State | Published - 2021 |
Keywords
- Gaussian mechanism
- Privacy protection
- moments accountant
- noise variance
- privacy-utility trade-off