DM-DPL: Toward Discrete Matrixing Differentially Private Learning

  • Jinhao Zhou
  • Zhou Su
  • Yuntao Wang
  • Jun Wu
Research output: Contribution to journal › Article › peer-review

Abstract

Differentially private learning is widely used in machine learning (ML) to protect continuous, scalar-valued data. However, the demand for discrete and matrix-valued computation is growing, particularly in quantized neural networks and graph learning, which require discrete-valued parameters and large-scale matrix operations for efficient data processing. Privacy protection for discrete and matrix-valued data remains underexplored: traditional differentially private mechanisms fail to preserve the discrete nature of data after perturbation and often ignore data correlations, making it difficult to balance privacy and utility. In this paper, we propose a Discrete Matrixing Differentially Private Learning (DM-DPL) framework, which protects the privacy of discrete and matrix-valued data during ML training by adding discrete matrix-variate Gaussian noise. First, we propose a novel Discrete Matrix-Variate Gaussian (DMVG) mechanism, with rigorous conditions necessary to guarantee (ε, δ)-differential privacy. Additionally, we present an eigenvalue-weighted, analysis-based precision budget allocation strategy designed to maintain the utility of significant dimensions while providing consistent privacy guarantees. Finally, our results show that the proposed approach significantly surpasses existing state-of-the-art methods when applied to quantized federated learning. To the best of our knowledge, this is the first work to specifically protect discrete and matrix-valued data during ML training.
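To make the idea concrete, the sketch below shows entrywise discrete Gaussian perturbation of an integer-valued (quantized) weight matrix, using the standard rejection sampler for the discrete Gaussian (propose from a discrete Laplace, accept with a Gaussian correction). Note this is an illustrative, i.i.d. entrywise stand-in, not the paper's correlated DMVG mechanism or its eigenvalue-weighted budget allocation; all function names and the choice of `sigma` are hypothetical.

```python
import math
import random

def sample_discrete_laplace(t: float) -> int:
    # The difference of two i.i.d. geometric variables (support 0, 1, 2, ...)
    # follows a two-sided geometric ("discrete Laplace") distribution with
    # P[k] proportional to exp(-|k| / t).
    p = 1.0 - math.exp(-1.0 / t)
    g1 = int(math.log(1.0 - random.random()) / math.log(1.0 - p))
    g2 = int(math.log(1.0 - random.random()) / math.log(1.0 - p))
    return g1 - g2

def sample_discrete_gaussian(sigma: float) -> int:
    # Rejection sampler for the discrete Gaussian on the integers with
    # variance parameter sigma^2: propose from a discrete Laplace with
    # scale t = floor(sigma) + 1, then accept with a Gaussian correction.
    t = math.floor(sigma) + 1
    while True:
        y = sample_discrete_laplace(t)
        accept = math.exp(-((abs(y) - sigma ** 2 / t) ** 2) / (2 * sigma ** 2))
        if random.random() < accept:
            return y

def privatize_matrix(weights, sigma=4.0):
    # Adding integer-valued noise keeps a quantized weight matrix
    # integer-valued after perturbation, unlike continuous Gaussian noise.
    return [[w + sample_discrete_gaussian(sigma) for w in row] for row in weights]
```

Because every noise draw is an integer, the perturbed matrix stays on the quantization grid; the paper's DMVG mechanism additionally couples the noise across rows and columns to account for matrix structure.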

Original language: English
Pages (from-to): 6149-6161
Number of pages: 13
Journal: IEEE Transactions on Information Forensics and Security
Volume: 20
DOIs
State: Published - 2025

Keywords

  • Machine learning
  • differential privacy
  • discrete Gaussian
  • matrix-variate Gaussian
