Knowledge Distillation Before Formal Verification of Deep Neural Networks / Jordan Perr-Sauer.

This thesis explores the potential of applying knowledge distillation to overcome the scaling problem in formal verification of deep neural networks. Instead of verifying the large neural network directly, we first compress the neural network with knowledge distillation and then verify the compressed network.
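As a rough illustration of the distillation step the abstract describes, the sketch below trains a small student network to mimic a larger teacher using standard Hinton-style soft targets; the verifier would then be run on the smaller student alone. This is not the thesis's actual code: the architectures, temperature T, and mixing weight alpha are assumptions chosen for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical teacher/student sizes; the thesis does not fix an architecture here.
teacher = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
student = nn.Sequential(nn.Linear(784, 32), nn.ReLU(), nn.Linear(32, 10))

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style distillation: KL divergence against the teacher's softened
    outputs at temperature T, blended with cross-entropy on the hard labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients stay comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
teacher.eval()  # the teacher is frozen; only the student is trained

# One illustrative training step on random data standing in for a real batch.
x = torch.randn(64, 784)
y = torch.randint(0, 10, (64,))
with torch.no_grad():
    t_logits = teacher(x)
loss = distillation_loss(student(x), t_logits, y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

After training converges, only the compact student (a few thousand parameters here, versus the teacher's hundreds of thousands) is passed to the formal verifier, which is the scaling benefit the abstract claims.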

Bibliographic Details
Online Access: Connect to online resource
Main Author: Perr-Sauer, Jordan
Format: Thesis; Electronic eBook
Language: English
Published: Ann Arbor: ProQuest Dissertations & Theses, 2022