GuardNN: Secure Accelerator Architecture for Privacy-Preserving Deep Learning
Abstract
This paper proposes GuardNN, a secure DNN accelerator that provides hardware-based protection for user data and model parameters even in an untrusted environment. GuardNN shows that the architecture and protection can be customized for a specific application to provide strong confidentiality and integrity guarantees with negligible overhead. The design of the GuardNN instruction set reduces the TCB to just the accelerator and allows confidentiality protection even when the instructions from a host cannot be trusted. GuardNN minimizes the overhead of memory encryption and integrity verification by customizing the off-chip memory protection for the known memory access patterns of a DNN accelerator. GuardNN is prototyped on an FPGA, demonstrating effective confidentiality protection with ~3% performance overhead for inference.
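The key idea of customizing memory protection to known access patterns can be illustrated with counter-mode encryption: because a DNN accelerator's off-chip reads and writes follow a fixed, statically known schedule, per-block version counters can be derived from that schedule instead of being stored and fetched from memory. The following is a minimal, hypothetical Python sketch of this idea, not GuardNN's actual design; the hash-based keystream stands in for AES-CTR, and all names are illustrative.

```python
import hashlib

def keystream(key: bytes, address: int, version: int, length: int) -> bytes:
    """Derive a keystream from (key, address, version).

    Stand-in for an AES counter-mode pad; each (address, version) pair
    must be used at most once under a given key.
    """
    out = b""
    ctr = 0
    while len(out) < length:
        out += hashlib.sha256(
            key
            + address.to_bytes(8, "big")
            + version.to_bytes(8, "big")
            + ctr.to_bytes(4, "big")
        ).digest()
        ctr += 1
    return out[:length]

def xcrypt_block(key: bytes, address: int, version: int, data: bytes) -> bytes:
    """XOR the data with the keystream; encryption and decryption
    are the same operation in counter mode."""
    ks = keystream(key, address, version, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

def version_for(layer: int, writes_per_layer: int) -> int:
    # Because the accelerator's write schedule is fixed, the version
    # counter for a buffer is a pure function of the layer index --
    # no off-chip counter storage or verification traffic is needed.
    return layer * writes_per_layer
```

A conventional memory-encryption engine must fetch and integrity-check stored counters on every access; deriving them from the known schedule, as sketched above, is what lets the protection overhead stay small.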
- Publication: arXiv e-prints
- Pub Date: August 2020
- DOI: 10.48550/arXiv.2008.11632
- arXiv: arXiv:2008.11632
- Bibcode: 2020arXiv200811632H
- Keywords: Computer Science - Cryptography and Security; Computer Science - Hardware Architecture; Computer Science - Machine Learning
- E-Print:
- Accepted to the 59th Design Automation Conference (DAC'22)