<italic>p</italic>-Power Exponential Mechanisms for Differentially Private Machine Learning
Differentially private stochastic gradient descent (DP-SGD), which perturbs the clipped gradients, is a popular approach for private machine learning. The Gaussian mechanism (GM), combined with the moments accountant (MA), has demonstrated a much better privacy-utility tradeoff than using the advanced compos...
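The abstract refers to DP-SGD's core step: clip each per-example gradient to a fixed L2 norm, average, and add Gaussian noise. Below is a minimal NumPy sketch of one such update, assuming per-example gradients are already available as arrays; the function name and parameters are illustrative, not from the paper.

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm, noise_multiplier, rng):
    """One DP-SGD gradient update (sketch): clip each per-example gradient
    to L2 norm `clip_norm`, average, then add Gaussian noise whose scale
    is proportional to `noise_multiplier * clip_norm` (Gaussian mechanism)."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale the gradient down only if its norm exceeds clip_norm.
        clipped.append(g / max(1.0, norm / clip_norm))
    mean_grad = np.mean(clipped, axis=0)
    # Noise standard deviation follows the common sigma*C/n convention.
    sigma = noise_multiplier * clip_norm / len(per_example_grads)
    noise = rng.normal(0.0, sigma, size=mean_grad.shape)
    return mean_grad + noise
```

With `noise_multiplier = 0` the function reduces to plain clipped-gradient averaging, which makes the clipping behavior easy to check in isolation.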
Saved in:
| Main Authors: | Yanan Li, Xuebin Ren, Fangyuan Zhao, Shusen Yang |
|---|---|
| Format: | article |
| Language: | EN |
| Published: | IEEE, 2021 |
| Subjects: | |
| Online Access: | https://doaj.org/article/d91648a81c8e4395a2b8d3247e9c873c |
Similar Items
- Review: Privacy-Preservation in the Context of Natural Language Processing
  by: Darshini Mahendran, et al.
  Published: (2021)
- Location privacy protection scheme for LBS users based on differential privacy
  by: Naiwen YU, et al.
  Published: (2021)
- Enhancing Differential Privacy for Federated Learning at Scale
  by: Chunghun Baek, et al.
  Published: (2021)
- Differentially private partition selection
  by: Desfontaines Damien, et al.
  Published: (2022)
- RcDT: Privacy Preservation Based on R-Constrained Dummy Trajectory in Mobile Social Networks
  by: Jinquan Zhang, et al.
  Published: (2019)