A Generic Hybrid 2PC Framework with Application to Private Inference of Unmodified Neural Networks

Master Thesis

Published in Privacy in Machine Learning Workshop (PriML@NeurIPS'21)

Author

Lennart Braun

Abstract

This thesis presents a new framework for mixed-protocol secure two-party computation (2PC) in the semi-honest setting, based on the recent MOTION framework (Braun et al.). We implement five different 2PC protocols – Yao’s garbled circuits, the arithmetic and Boolean variants of Goldreich-Micali-Wigderson (GMW), and two secret-sharing-based protocols from ABY 2.0 (Patra et al.) – together with 20 conversions among them and new optimizations. Our framework’s modular architecture of flexible and reusable components allows researchers to rapidly implement new 2PC protocols and to build a system tailored to their needs. We include an extensive analysis and benchmarks of the protocol combinations as well as comparisons with prior work.
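To make the idea of mixed-protocol 2PC concrete, the following minimal Python sketch (illustrative only, not MOTION’s actual API) shows the two basic sharing types that such frameworks combine: additive arithmetic sharing over Z_{2^K} and XOR-based Boolean sharing. All function names are hypothetical.

import secrets

K = 32
MOD = 1 << K

def arith_share(x):
    """Split x into two additive shares: x = (x0 + x1) mod 2^K."""
    x0 = secrets.randbelow(MOD)
    return x0, (x - x0) % MOD

def arith_reconstruct(x0, x1):
    return (x0 + x1) % MOD

def bool_share(x):
    """Split x into two XOR shares: x = x0 ^ x1."""
    x0 = secrets.randbits(K)
    return x0, x ^ x0

def bool_reconstruct(x0, x1):
    return x0 ^ x1

# Linear operations are local on arithmetic shares ...
a0, a1 = arith_share(20)
b0, b1 = arith_share(22)
s0, s1 = (a0 + b0) % MOD, (a1 + b1) % MOD   # addition without interaction
assert arith_reconstruct(s0, s1) == 42

# ... while bitwise operations are local on Boolean shares.
c0, c1 = bool_share(0b1100)
d0, d1 = bool_share(0b1010)
x0, x1 = c0 ^ d0, c1 ^ d1                    # XOR without interaction
assert bool_reconstruct(x0, x1) == 0b0110

A hybrid framework converts between such representations (and garbled-circuit encodings) so that each sub-computation, e.g. a multiplication versus a comparison, runs in the protocol that suits it best; the conversions themselves require interactive protocols.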

Moreover, we explore the feasibility of evaluating neural networks with 2PC without modifying their structure. We extend MOTION with secure tensor data types and specialized building blocks for common tensor operations. Together with support for the Open Neural Network Exchange (ONNX) file format, this yields an easy-to-use solution for securely evaluating neural networks. We show that exploiting the networks’ high-level structure yields significantly better performance than evaluating the corresponding low-level hybrid circuits. Overall, the performance is comparable to recent works while preserving the network structure and using only common 2PC techniques.
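A hedged sketch of the high-level approach: instead of compiling the network to a low-level circuit, one can walk the ONNX graph and dispatch each operator to a specialized secure tensor building block. Only the use of the onnx Python package below is real API; the kernel names and the mapping are illustrative assumptions, not the thesis’ actual interface.

import onnx

# Illustrative mapping from ONNX operator types to secure tensor building
# blocks (e.g. matrix multiplication on arithmetic shares, ReLU via a
# Boolean/Yao sub-protocol followed by a conversion back to arithmetic shares).
SECURE_KERNELS = {
    "Gemm": "secure matmul (arithmetic sharing)",
    "Conv": "secure conv2d (arithmetic sharing)",
    "Relu": "secure ReLU (Boolean/Yao sharing + conversion)",
    "MaxPool": "secure maxpool (Boolean/Yao sharing)",
    "AveragePool": "secure avgpool (arithmetic sharing)",
    "Flatten": "local reshape (no interaction)",
}

def plan_secure_inference(model_path: str):
    """Print which secure building block would handle each ONNX node."""
    model = onnx.load(model_path)   # parse the .onnx file
    for node in model.graph.node:   # NodeProto entries in topological order
        kernel = SECURE_KERNELS.get(node.op_type, "fallback: generic circuit")
        print(f"{node.name or node.op_type}: {node.op_type} -> {kernel}")

# Example usage with any ONNX model file:
# plan_secure_inference("model.onnx")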

Supervisors

Publications