Privacy-Preserving Machine Learning

We characterize the performance overhead of privacy-preserving computation techniques, focusing on homomorphic encryption (HE). Homomorphic encryption makes it possible to compute directly on encrypted data, but at the cost of a huge computation overhead, which poses challenges for its practical deployment. We analyze this overhead and build a performance and cost model for it. We investigate the practicality of this new computing model, which enables secure outsourcing of computation.
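As a minimal illustration of computing on encrypted data, the sketch below implements a toy additively homomorphic scheme in the style of Paillier. The parameters are tiny and deliberately insecure, and the scheme choice is ours for illustration (the text does not commit to a particular HE scheme); the point is only that an operation on ciphertexts (here, multiplication) corresponds to an operation on the hidden plaintexts (addition), without ever decrypting.

```python
import random
from math import gcd

# Toy, INSECURE parameters for illustration only; real deployments use
# thousands-of-bit moduli, which is one source of HE's computation overhead.
p, q = 293, 433
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)
g = n + 1

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # precomputed decryption constant

def encrypt(m):
    # Randomized encryption: c = g^m * r^n mod n^2
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1 = encrypt(17)
c2 = encrypt(25)
# Homomorphic addition: multiplying ciphertexts adds the plaintexts.
c_sum = (c1 * c2) % n2
print(decrypt(c_sum))  # 42
```

Note that each homomorphic addition costs a full modular multiplication over a squared modulus, already hinting at why encrypted computation is orders of magnitude slower than plaintext arithmetic.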

We place emphasis on privacy-preserving machine learning/deep learning applications because recent advances in (personalized) deep learning can significantly facilitate our lives, provided that users' privacy is protected. To make homomorphic encryption practical for these applications, we optimize it at both the algorithm and hardware levels. Specifically, based on the characteristics of deep learning and homomorphic encryption workloads, we study how to better fit the pipeline of homomorphic encryption operations to the hardware. We design a specialized hardware accelerator and conduct performance and power analysis with the algorithm-level optimizations applied.
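To make the hardware-fitting idea concrete: lattice-based HE schemes spend most of their cycles on large polynomial multiplications, which are typically carried out with the number-theoretic transform (NTT), so the NTT is a natural kernel to map onto a specialized accelerator. The toy sketch below (our illustrative example with tiny parameters, not the accelerator design itself) shows the structure being exploited: transform, cheap pointwise multiply, inverse transform.

```python
# Toy NTT-based polynomial multiplication over Z_q.
# Tiny illustrative parameters: q = 257 is prime and 8 divides q - 1,
# so an 8-point NTT exists; real HE uses much larger q and n.
q = 257
n = 8
w = pow(3, (q - 1) // n, q)  # primitive n-th root of unity mod q

def ntt(a, root):
    """Recursive Cooley-Tukey transform of a length-2^k list over Z_q."""
    if len(a) == 1:
        return a[:]
    even = ntt(a[0::2], root * root % q)
    odd = ntt(a[1::2], root * root % q)
    out = [0] * len(a)
    t = 1
    for i in range(len(a) // 2):
        out[i] = (even[i] + t * odd[i]) % q
        out[i + len(a) // 2] = (even[i] - t * odd[i]) % q
        t = t * root % q
    return out

def intt(a):
    """Inverse transform: use the inverse root and scale by n^-1."""
    inv_n = pow(n, -1, q)
    return [x * inv_n % q for x in ntt(a, pow(w, -1, q))]

# Multiply (1 + 2x + 3x^2 + 4x^3) by (5 + 6x + 7x^2), zero-padded to
# length n so the cyclic convolution equals the ordinary product.
a = [1, 2, 3, 4, 0, 0, 0, 0]
b = [5, 6, 7, 0, 0, 0, 0, 0]
A, B = ntt(a, w), ntt(b, w)
C = [x * y % q for x, y in zip(A, B)]  # pointwise multiply: fully parallel
c = intt(C)
print(c)  # [5, 16, 34, 52, 45, 28, 0, 0]
```

The butterfly stages have regular, data-independent access patterns and the pointwise multiply is embarrassingly parallel, which is what makes this pipeline attractive to lay out as dedicated hardware.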