Abstract
Out-of-Distribution (OOD) detection is critical for ensuring the reliability
of machine learning models in safety-critical applications such as autonomous
driving and medical diagnosis. While deploying personalized OOD detection
directly on edge devices is desirable, it remains challenging due to large
model sizes and the computational infeasibility of on-device training.
Federated learning partially addresses this but still requires gradient
computation and backpropagation, exceeding the capabilities of many edge
devices. To overcome these challenges, we propose SecDOOD, a secure
cloud-device collaboration framework for efficient on-device OOD detection
without requiring device-side backpropagation. SecDOOD utilizes cloud resources
for model training while ensuring user data privacy by retaining sensitive
information on-device. Central to SecDOOD is a HyperNetwork-based personalized
parameter generation module, which adapts cloud-trained models to
device-specific distributions by dynamically generating local weight
adjustments, effectively combining central and local information without local
fine-tuning. Additionally, our dynamic feature sampling and encryption strategy
selectively encrypts only the most informative feature channels, substantially
reducing encryption overhead without compromising detection performance.
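As a minimal sketch of the selective-encryption idea, the snippet below scores feature channels and marks only the top-k for encryption. Scoring channels by mean activation magnitude and the function names are illustrative assumptions, not the paper's exact criterion:

```python
import numpy as np

def select_informative_channels(features: np.ndarray, k: int) -> np.ndarray:
    """Return indices of the k channels with the highest mean activation
    magnitude. `features` has shape (num_channels, ...). The magnitude-based
    score is a stand-in for the paper's actual informativeness measure."""
    scores = np.abs(features).reshape(features.shape[0], -1).mean(axis=1)
    return np.argsort(scores)[-k:]

def split_for_encryption(features: np.ndarray, k: int):
    """Split features into a small to-encrypt part and a plaintext remainder,
    so only the informative channels incur encryption overhead."""
    idx = select_informative_channels(features, k)
    mask = np.zeros(features.shape[0], dtype=bool)
    mask[idx] = True
    return features[mask], features[~mask]
```

Only the first returned array would be passed through the (comparatively expensive) encryption step, which is where the overhead reduction comes from.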
Extensive experiments across multiple datasets and OOD scenarios demonstrate
that SecDOOD achieves performance comparable to fully fine-tuned models,
enabling secure, efficient, and personalized OOD detection on resource-limited
edge devices. To enhance accessibility and reproducibility, our code is
publicly available at https://github.com/Dystopians/SecDOOD.
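To make the HyperNetwork-based personalization concrete, here is a toy sketch in which a small MLP maps a device embedding to an additive weight adjustment for a cloud-trained layer, so no device-side backpropagation is needed. All dimensions, names, and the two-layer architecture are hypothetical, chosen only to illustrate the mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)

def hypernetwork(device_embedding: np.ndarray,
                 W1: np.ndarray, W2: np.ndarray) -> np.ndarray:
    """Toy two-layer MLP mapping a device embedding to a flat weight
    adjustment (delta) for one target layer."""
    h = np.maximum(device_embedding @ W1, 0.0)  # ReLU hidden layer
    return h @ W2

# Hypothetical sizes: 16-d device embedding, 32 hidden units,
# 64 parameters in the target layer.
emb_dim, hidden, target_params = 16, 32, 64
W1 = rng.normal(0.0, 0.1, (emb_dim, hidden))   # trained in the cloud
W2 = rng.normal(0.0, 0.1, (hidden, target_params))

base_weights = rng.normal(size=target_params)  # cloud-trained layer weights
device_emb = rng.normal(size=emb_dim)          # summarizes local distribution

# Personalization is a forward pass only: base weights plus a generated delta.
personalized = base_weights + hypernetwork(device_emb, W1, W2)
```

The key point mirrored from the abstract is that adaptation happens through a forward pass of the HyperNetwork, combining central (base weights) and local (device embedding) information without local fine-tuning.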