A Practical Scheme for Two-Party Private Linear Least Squares

Published 26 Jan 2019 in cs.CR | (1901.09281v1)

Abstract: Privacy-preserving machine learning addresses learning from sensitive datasets that are typically distributed across multiple data owners. Private machine learning is particularly challenging in the many realistic scenarios where no trusted third party can act as a mediator; the strongly decentralized nature of these scenarios calls for tools from both the cryptography and distributed systems communities. In this paper, we present a practical scheme suited to a subclass of machine learning algorithms and discuss directions for future research. Specifically, we show how to learn a linear least squares model across two parties using gradient descent and additive homomorphic encryption. The protocol requires two rounds of communication per gradient descent step. We detail our approach, including a fixed-point encoding scheme and one-time random pads for hiding intermediate results.
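The fixed-point encoding and one-time-pad masking the abstract mentions can be sketched roughly as follows. This is an illustrative sketch only, not the paper's implementation: the scale factor, modulus, and helper names (`encode`, `mask`, etc.) are assumptions chosen for clarity.

```python
# Illustrative sketch: fixed-point encoding of reals as integers modulo a
# large modulus, plus additive one-time-pad masking of the kind used to
# hide intermediate results in additively homomorphic protocols.
import secrets

SCALE = 2**16              # fixed-point scale factor (assumed parameter)
MODULUS = 2**127 - 1       # large modulus for additive masking (assumed)

def encode(x: float) -> int:
    """Encode a real number as a fixed-point integer mod MODULUS."""
    return round(x * SCALE) % MODULUS

def decode(v: int) -> float:
    """Decode, mapping residues above MODULUS // 2 back to negatives."""
    if v > MODULUS // 2:
        v -= MODULUS
    return v / SCALE

def mask(v: int) -> tuple[int, int]:
    """Hide v under a fresh one-time random pad; return (masked, pad)."""
    pad = secrets.randbelow(MODULUS)
    return (v + pad) % MODULUS, pad

def unmask(masked: int, pad: int) -> int:
    return (masked - pad) % MODULUS

# One party can reveal only the masked value; the other adds its own
# encoded contribution, and removing the pad recovers the encoded sum.
m, pad = mask(encode(0.5))
combined = (m + encode(0.25)) % MODULUS
print(decode(unmask(combined, pad)))  # → 0.75
```

In a two-party protocol of this shape, the pad plays the role of the one-time random value that keeps an intermediate gradient quantity hidden from the party that computes on it homomorphically.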
