*p*-Power Exponential Mechanisms for Differentially Private Machine Learning
Differentially private stochastic gradient descent (DP-SGD), which perturbs the clipped gradients, is a popular approach for private machine learning. The Gaussian mechanism (GM), combined with the moments accountant (MA), has demonstrated a much better privacy-utility tradeoff than using the advanced composition theorem...
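The abstract's core ingredients, per-example gradient clipping followed by Gaussian noise, can be sketched as below. This is a minimal illustration of the standard DP-SGD aggregation step, not the paper's *p*-power exponential mechanism; the function name and parameters are illustrative.

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm, noise_multiplier, rng):
    """One DP-SGD aggregation step (illustrative sketch):
    clip each per-example gradient to L2 norm clip_norm, sum them,
    then add Gaussian noise scaled to the clipping bound (the
    Gaussian mechanism), and average over the batch."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose L2 norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    # Noise standard deviation is proportional to the sensitivity clip_norm.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_example_grads)
```

The privacy guarantee comes from the fixed sensitivity that clipping enforces: each example can change the sum by at most `clip_norm`, so Gaussian noise of standard deviation `noise_multiplier * clip_norm` yields a per-step guarantee that the moments accountant then composes across iterations.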
Saved in:
Main Authors: | Yanan Li, Xuebin Ren, Fangyuan Zhao, Shusen Yang |
Format: | Article |
Language: | EN |
Published: | IEEE, 2021 |
Online Access: | https://doaj.org/article/d91648a81c8e4395a2b8d3247e9c873c |
Similar Items
- Review: Privacy-Preservation in the Context of Natural Language Processing
  by: Darshini Mahendran, et al.
  Published: (2021)
- Location privacy protection scheme for LBS users based on differential privacy
  by: Naiwen YU, et al.
  Published: (2021)
- Enhancing Differential Privacy for Federated Learning at Scale
  by: Chunghun Baek, et al.
  Published: (2021)
- Differentially private partition selection
  by: Desfontaines Damien, et al.
  Published: (2022)
- RcDT: Privacy Preservation Based on R-Constrained Dummy Trajectory in Mobile Social Networks
  by: Jinquan Zhang, et al.
  Published: (2019)