Abstract
Machine learning systems are increasingly being deployed to edge servers and
devices (e.g., mobile phones) and trained in a collaborative manner. Such
distributed/federated/decentralized training raises a number of concerns about
the robustness, privacy, and security of the procedure. While extensive work
has been done on tackling robustness, privacy, and security individually,
their combination has rarely been studied. In this paper, we propose a secure
two-server protocol that offers both input privacy and Byzantine-robustness. In
addition, this protocol is communication-efficient, fault-tolerant, and enjoys
local differential privacy.