
THE-X: Privacy-Preserving Transformer Inference with Homomorphic Encryption

Tianyu Chen, Hangbo Bao, Shaohan Huang, Li Dong, Binxing Jiao, Daxin Jiang, Haoyi Zhou, Jianxin Li, Furu Wei


Abstract
As more and more pre-trained language models are deployed in the cloud, privacy concerns grow quickly, mainly due to the exposure of plain-text user data (e.g., search history, medical records, bank accounts). Privacy-preserving inference of transformer models is in demand among cloud-service users. To protect privacy, an attractive option is to compute only on ciphertext under homomorphic encryption (HE). However, enabling pre-trained model inference on ciphertext data is difficult because of the complex computations in transformer blocks, which are not yet supported by current HE tools. In this work, we introduce THE-X, an approximation approach for transformers that enables privacy-preserving inference of pre-trained models developed with popular frameworks. THE-X proposes a workflow to handle the complex computations in transformer networks, including all the non-polynomial functions such as GELU, softmax, and LayerNorm. Experiments show that THE-X enables transformer inference on encrypted data for a range of downstream tasks, with negligible performance drop while enjoying a theory-guaranteed privacy-preserving advantage.
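The core obstacle the abstract names is that HE schemes (e.g., CKKS) natively evaluate only additions and multiplications, so non-polynomial functions such as GELU must be replaced by polynomial surrogates before encrypted inference. As a minimal illustrative sketch (not the paper's actual workflow, which also handles softmax and LayerNorm), one could fit a low-degree polynomial to GELU on a bounded input interval:

```python
from math import erf, sqrt

import numpy as np

def gelu(x):
    """Exact GELU via the Gaussian CDF (erf formulation)."""
    return 0.5 * x * (1.0 + np.vectorize(erf)(x / sqrt(2.0)))

# HE-friendly surrogate: a least-squares polynomial fit on [-4, 4].
# The interval and degree are illustrative choices, not values from the paper.
xs = np.linspace(-4.0, 4.0, 2001)
coeffs = np.polyfit(xs, gelu(xs), deg=4)
poly_gelu = np.poly1d(coeffs)  # evaluable with only + and * under HE

max_err = np.max(np.abs(poly_gelu(xs) - gelu(xs)))
print(f"max abs approximation error on [-4, 4]: {max_err:.4f}")
```

The trade-off is typical of HE approximation: a higher degree tightens the fit but deepens the multiplicative circuit, which is expensive in leveled schemes, and accuracy degrades outside the fitted interval.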
Anthology ID:
2022.findings-acl.277
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3510–3520
URL:
https://aclanthology.org/2022.findings-acl.277
DOI:
10.18653/v1/2022.findings-acl.277
Cite (ACL):
Tianyu Chen, Hangbo Bao, Shaohan Huang, Li Dong, Binxing Jiao, Daxin Jiang, Haoyi Zhou, Jianxin Li, and Furu Wei. 2022. THE-X: Privacy-Preserving Transformer Inference with Homomorphic Encryption. In Findings of the Association for Computational Linguistics: ACL 2022, pages 3510–3520, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
THE-X: Privacy-Preserving Transformer Inference with Homomorphic Encryption (Chen et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.277.pdf
Software:
2022.findings-acl.277.software.zip
Data:
CoNLL 2003, GLUE, MRPC, MultiNLI, QNLI, SST, SST-2