Efficient Privacy-Preserving Neural Network Training with TEEs

Seminar Thesis


Privacy-preserving machine learning (PPML) is an active topic in the privacy research community, and private deep learning in particular has gained considerable attention in recent years. However, most early works focus on privacy-preserving inference, e.g., [1-3], assuming an already trained model. Yet to leverage the vast amount of data distributed across many devices, models must be trained collaboratively; moreover, outsourcing the computation is often required due to resource limitations at the edge. Both multi-party computation and outsourcing require preserving the privacy of the data used for training. Only recently have a few works begun to investigate privacy-preserving training of deep neural networks using secret sharing or secure multi-party computation techniques [4-11].


An interesting work by Tramèr and Boneh [3] combines a trusted execution environment (TEE) such as Intel SGX with blinding by random values to build a highly efficient inference system. This thesis should extend their work to privacy-preserving training. The goal is first to design a protocol that splits the computation needed to train classical neural networks between the TEE and the untrusted operating system. One option is to investigate efficient conversions between several secure computation techniques in combination with TEEs. The protocol should balance the resource-limited secure environment of the TEE against the untrusted but powerful operating system, which enables multi-threading, GPU usage, etc., to achieve maximal efficiency while protecting data privacy.
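To illustrate the blinding idea behind this TEE/host split, the following sketch shows how a private linear-layer computation can be outsourced: the TEE precomputes a random mask and its product with the public weights, reveals only the masked input to the untrusted host, and unblinds the result locally. This is a simplified floating-point illustration only; the actual protocol of Tramèr and Boneh operates over a finite field with integer arithmetic, and all variable names here are illustrative assumptions, not part of their implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Model weights W are known to the untrusted host; the input x is private.
n, d, k = 4, 8, 5
W = rng.standard_normal((d, k))
x = rng.standard_normal((n, d))   # private input, held inside the TEE

# Offline phase (inside the TEE): draw a random blinding mask r and
# precompute its product with the public weights.
r = rng.standard_normal((n, d))
rW = r @ W                        # unblinding share, kept inside the TEE

# Online phase: the TEE reveals only the blinded input x + r to the
# untrusted host, which performs the heavy matrix multiplication
# (e.g., multi-threaded or on a GPU).
blinded = (x + r) @ W             # computed outside the TEE

# Back inside the TEE: remove the blinding to recover x @ W cheaply.
y = blinded - rW

assert np.allclose(y, x @ W)
```

The point of the split is asymmetry: the TEE only performs cheap additions and subtractions online, while the expensive matrix product runs on untrusted, hardware-accelerated resources without ever seeing the unmasked input.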

The student should first review related work and identify opportunities for accelerating neural network training using TEEs. The protocol should then be designed, implemented, and benchmarked on various model sizes, and compared both theoretically and experimentally to related work.


Requirements:

  • Good programming skills in C/C++
  • Basic knowledge of cryptography
  • First experiences with TEEs (e.g., Intel SGX)
  • Good mathematical background
  • Sound background knowledge on neural networks
  • High motivation + ability to work independently
  • Proficiency in English, Git, LaTeX, etc. is taken for granted