Implications of PAC-learning separations for private distribution learning
Determine whether the known separations between non-private PAC learning and differentially private PAC learning of functions—under both (ε,0)-differential privacy and (ε,δ)-differential privacy, where approximate-DP learnability is characterized by the Littlestone dimension rather than the VC dimension—have any implications for the sample complexity or structural characterizations of privately learning distributions in total variation distance.
References
On the related task of PAC learning of functions, a rich line of work shows strong separations between non-private learning and private learning, under both $(\varepsilon, 0)$-DP and $(\varepsilon, \delta)$-DP. In particular, for approximate DP, learnability is characterized by the Littlestone dimension, rather than the VC dimension as in the non-private setting. However, given the substantial differences between the two settings, it is unclear whether these separations have any implications for private distribution learning.
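For reference, the two privacy notions compared above are the standard ones: a randomized learner $M$ is $(\varepsilon, \delta)$-DP if for all pairs of datasets $S, S'$ differing in one example and all measurable events $E$,
\[
\Pr[M(S) \in E] \;\le\; e^{\varepsilon} \Pr[M(S') \in E] + \delta,
\]
with pure DP being the special case $\delta = 0$. The separation mentioned above says that finite Littlestone dimension is necessary and sufficient for $(\varepsilon, \delta)$-DP PAC learning of a function class, whereas non-private PAC learnability requires only finite VC dimension; since the VC dimension lower-bounds the Littlestone dimension and the gap can be infinite (e.g., thresholds over $\mathbb{R}$ have VC dimension $1$ but infinite Littlestone dimension), some VC classes are learnable non-privately but not privately.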