Knowledge Distillation Before Formal Verification of Deep Neural Networks / Jordan Perr-Sauer.
This thesis explores the potential of applying knowledge distillation to overcome the scaling problem in the formal verification of deep neural networks. Instead of verifying the large neural network directly, we first compress it with knowledge distillation and then verify the compressed...
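The abstract describes a two-step pipeline: compress the large network into a smaller student via knowledge distillation, then run the formal verifier on the student instead. Below is a minimal sketch of the distillation step, assuming a PyTorch setup; the layer sizes, temperature `T`, and blending weight `alpha` are illustrative assumptions, not the thesis's actual configuration.

```python
# Minimal knowledge-distillation sketch (illustrative assumptions throughout:
# architectures, temperature, and loss weighting are NOT from the thesis).
import torch
import torch.nn as nn
import torch.nn.functional as F

# Large "teacher" network: the model too big to verify directly.
teacher = nn.Sequential(nn.Linear(784, 512), nn.ReLU(),
                        nn.Linear(512, 512), nn.ReLU(),
                        nn.Linear(512, 10))

# Small "student" network: the compressed model handed to the verifier.
student = nn.Sequential(nn.Linear(784, 32), nn.ReLU(),
                        nn.Linear(32, 10))

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style distillation: KL divergence against the teacher's
    temperature-softened outputs, blended with ordinary cross-entropy."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
teacher.eval()

# One training step on a random batch (a stand-in for real data).
x = torch.randn(64, 784)
y = torch.randint(0, 10, (64,))
with torch.no_grad():
    teacher_logits = teacher(x)
loss = distillation_loss(student(x), teacher_logits, y)
optimizer.zero_grad()
loss.backward()
optimizer.step()

# The trained `student` would then be exported (e.g., to ONNX) and passed
# to a neural-network verifier in place of the teacher.
```

Scaling the KL term by T^2 keeps its gradient magnitude comparable to the hard-label loss as the temperature varies, following the standard distillation formulation.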
Main Author: Jordan Perr-Sauer
Format: Thesis; Electronic eBook
Language: English
Published: Ann Arbor : ProQuest Dissertations & Theses, 2022