Abstract
Homomorphic encryption (HE) is a promising technique for privacy-preserving
computation. Since HE schemes support only primitive polynomial operations, the
homomorphic evaluation of polynomial approximations to non-polynomial functions
plays an important role in privacy-preserving machine learning. In this paper,
we introduce a simple solution for approximating arbitrary functions, one that
may have been overlooked by researchers: simply use neural networks for
regression. By searching for suitable hyperparameters, neural networks can
achieve near-optimal multiplicative depth for a given function at a fixed
precision, thereby reducing the modulus consumed.
There are three main reasons why we choose neural networks for the homomorphic
evaluation of polynomial approximations. First, neural networks with polynomial
activation functions can approximate whatever functions are needed in an
encrypted state, so a single unified process handles any polynomial
approximation, such as that of Sigmoid or ReLU. Second, with a carefully chosen
architecture, a neural network can evaluate a polynomial at near-optimal
multiplicative depth, which consumes less modulus and therefore requires fewer
ciphertext refreshes. Finally, as popular tools, neural networks come with many
well-studied techniques that conveniently serve our solution.
Experiments show that our method can approximate a variety of functions. We
apply it to the evaluation of the Sigmoid function on the large intervals
$[-30, +30]$, $[-50, +50]$, and $[-70, +70]$.