# [Paper Notes] Deep Learning with Differential Privacy


## Questions still to clarify

• The three margin notes beside the Introduction
• The passages marked in the PDF
• In the paragraph immediately following Theorem 1, how is the asymptotic lower bound on σ derived from the strong composition theorem?
• Where the strong composition theorem is introduced, the paper cites [24], but that reference does not actually contain the strong composition theorem.

## The Moments Accountant

The traditional definition of the privacy loss random variable is

$$\mathcal{L}_{\mathcal{M},D_1,D_2}(o) = \ln\frac{\Pr\left[\mathcal{M}(D_1)=o\right]}{\Pr\left[\mathcal{M}(D_2)=o\right]}$$
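As a quick sanity check of this definition, the sketch below draws outputs of a Gaussian mechanism on a sensitivity-1 query and estimates the tail mass $\Pr[\mathcal{L} \ge \varepsilon]$ by Monte Carlo. The noise scale σ = 4 and target ε = 1 are illustrative choices, not values taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 4.0   # noise scale (illustrative assumption)
eps = 1.0     # target epsilon (illustrative assumption)

# Gaussian mechanism on a sensitivity-1 query over neighboring datasets:
# M(D1) ~ N(0, sigma^2), M(D2) ~ N(1, sigma^2)
o = rng.normal(0.0, sigma, size=1_000_000)  # outputs drawn under D1

# Privacy loss at output o:
# L(o) = ln[ p_{D1}(o) / p_{D2}(o) ] = (1 - 2*o) / (2 * sigma^2)
loss = (1.0 - 2.0 * o) / (2.0 * sigma**2)

# Empirical tail mass Pr[L >= eps]
delta_hat = np.mean(loss >= eps)
print(f"estimated Pr[L >= eps] = {delta_hat:.2e}")
```

Under $D_1$ the loss is itself Gaussian, $\mathcal{L} \sim N\!\big(\tfrac{1}{2\sigma^2}, \tfrac{1}{\sigma^2}\big)$, so the estimate can be cross-checked against the normal tail.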

My note: according to the proof of Theorem 2 above, the authors define $\delta$ as the value of $\Pr[c(o) \ge \varepsilon]$, but in fact this $\delta$ could be smaller …?

## Suggested reading order for the references

• [22] foundational background
• [8], [53] this paper follows up on and extends these
• [24] on the strong composition theorem
• [42] on the privacy accountant
• [44] the origin of the moments accountant: Rényi Differential Privacy
• [25] a differentially private PCA algorithm
• [9] the privacy amplification theorem used in the moments accountant
• [58] experimental accuracy on MNIST with convex empirical risk minimization

• [11] a generalization of [22]
• [23] advanced composition theorems and their refinements
