
Paper 2023/073

FssNN: Communication-Efficient Secure Neural Network Training via Function Secret Sharing

Peng Yang, Harbin Institute of Technology, Shenzhen
Zoe Lin Jiang, Harbin Institute of Technology, Shenzhen
Shiqi Gao, Harbin Institute of Technology, Shenzhen
Hongxiao Wang, The University of Hong Kong
Jun Zhou, Shanghai Pudong Development Bank
Yangyiye Jin, Shanghai Pudong Development Bank
Siu-Ming Yiu, The University of Hong Kong
Junbin Fang, Jinan University
Abstract

Privacy-preserving neural networks based on secure multi-party computation (MPC) enable multiple parties to jointly train neural network models without revealing their sensitive data. In such networks, the high communication cost of securely computing non-linear functions is the primary performance bottleneck. For commonly used non-linear functions such as ReLU, existing work adopts an offline-online computation paradigm and uses a distributed comparison function (DCF) to reduce communication costs: DCF keys are prepared in the offline phase, and secure ReLU is evaluated with these keys in the online phase. However, the practicality of existing work is significantly limited by the substantial size of DCF keys and the heavy reliance on a trusted third party in the offline phase. In this work, we introduce FssNN, a communication-efficient secure two-party neural network framework built on a key-reduced DCF scheme that requires no trusted third party, enabling practical secure training and inference. First, by analyzing the correlations between DCF key components and eliminating redundant parameters, we propose a key-reduced DCF scheme with a compact additive construction, which decreases the size of DCF keys by about $17.9\%$ and the offline communication costs by approximately $28.0\%$. Second, by leveraging an MPC-friendly pseudorandom number generator, we propose a secure two-party distributed key generation protocol for our key-reduced DCF, thereby eliminating the reliance on a trusted third party. Finally, we use the key-reduced DCF and additive secret sharing to compute non-linear and linear functions, respectively, and design secure computation protocols with constant online communication rounds for neural network operators, reducing the online communication costs by $28.9\% \sim 43.4\%$. We provide formal security proofs and evaluate the performance of FssNN on various models and datasets. Experimental results show that, compared to the state-of-the-art framework AriaNN, our framework reduces the total communication costs of secure training and inference by approximately $25.4\%$ and $26.4\%$, respectively.
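For intuition, the sketch below (not the paper's protocol) illustrates the two building blocks the abstract combines: additive secret sharing over the ring $\mathbb{Z}_{2^{64}}$, under which linear operations are computed locally with no communication, and a comparison-gated ReLU, since $\mathrm{ReLU}(x) = x \cdot [x > 0]$. In FssNN the comparison bit would be produced in secret-shared form by each party evaluating its (key-reduced) DCF key on a masked input; here an insecure stub stands in for that step. All function names are illustrative assumptions.

```python
# Toy sketch of additive secret sharing over Z_{2^64} and a comparison-gated
# ReLU. Illustrative only: `insecure_compare` is a stand-in for DCF key
# evaluation; a real protocol never reconstructs x to obtain the sign bit.

import secrets

MOD = 1 << 64  # the ring Z_{2^64}

def share(x):
    """Split x into two additive shares: x = x0 + x1 (mod 2^64)."""
    x0 = secrets.randbelow(MOD)
    x1 = (x - x0) % MOD
    return x0, x1

def reconstruct(x0, x1):
    return (x0 + x1) % MOD

def to_signed(x):
    """Interpret a ring element as a signed integer."""
    return x - MOD if x >= MOD // 2 else x

def add_shares(a, b):
    """Linear layers are local: each party adds its own shares."""
    return (a[0] + b[0]) % MOD, (a[1] + b[1]) % MOD

def insecure_compare(x0, x1):
    """Placeholder for DCF evaluation, returning the bit [x > 0].
    In FssNN this bit would itself be secret-shared, with each party
    evaluating its DCF key on the masked input."""
    return 1 if to_signed(reconstruct(x0, x1)) > 0 else 0

def relu_via_comparison(x0, x1):
    """ReLU(x) = x * [x > 0]: the comparison bit gates the shares."""
    b = insecure_compare(x0, x1)
    return (x0 * b) % MOD, (x1 * b) % MOD

if __name__ == "__main__":
    for x in (5, -3):
        ys = relu_via_comparison(*share(x % MOD))
        print(x, "->", to_signed(reconstruct(*ys)))  # 5 -> 5, -3 -> 0
```

Note that in an actual protocol the gating step multiplies shares by a secret-shared bit, which itself requires a secure multiplication (e.g., via precomputed multiplication triples); this is part of what the offline-online paradigm and the DCF key material amortize.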

Note: This is an updated version, with revised authorship and acceptance information added.

Metadata
Available format(s)
PDF
Category
Cryptographic protocols
Publication info
Published elsewhere. Minor revision. ProvSec 2024
Keywords
Privacy-preserving neural network
Secure multi-party computation
Additive secret sharing
Function secret sharing
Contact author(s)
stuyangpeng@stu.hit.edu.cn
zoeljiang@hit.edu.cn
History
2024-07-26: last of 5 revisions
2023-01-22: received
Short URL
https://ia.cr/2023/073
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2023/073,
      author = {Peng Yang and Zoe Lin Jiang and Shiqi Gao and Hongxiao Wang and Jun Zhou and Yangyiye Jin and Siu-Ming Yiu and Junbin Fang},
      title = {{FssNN}: Communication-Efficient Secure Neural Network Training via Function Secret Sharing},
      howpublished = {Cryptology {ePrint} Archive, Paper 2023/073},
      year = {2023},
      url = {https://eprint.iacr.org/2023/073}
}