Knowledge Distillation

A Knowledge Distillation-Based Backdoor Attack in Federated Learning

Authors: Yifan Wang, Wei Fan, Keke Yang, Naji Alhusaini, Jing Li | Published: 2022-08-12
Tags: Backdoor Attack, Knowledge Distillation

Trojan Horse Training for Breaking Defenses against Backdoor Attacks in Deep Learning

Authors: Arezoo Rajabi, Bhaskar Ramasubramanian, Radha Poovendran | Published: 2022-03-25
Tags: Trojan Horse Signature, Knowledge Distillation, Defense Method

EI-MTD: Moving Target Defense for Edge Intelligence against Adversarial Attacks

Authors: Yaguan Qian, Qiqi Shao, Jiamin Wang, Xiang Lin, Yankai Guo, Zhaoquan Gu, Bin Wang, Chunming Wu | Published: 2020-09-19 | Updated: 2020-11-25
Tags: Dynamic Service Scheduling, Adversarial Example, Knowledge Distillation

Online Robustness Training for Deep Reinforcement Learning

Authors: Marc Fischer, Matthew Mirman, Steven Stalder, Martin Vechev | Published: 2019-11-03 | Updated: 2019-11-22
Tags: Poisoning, Improvement of Learning, Knowledge Distillation