Keywords
Adversarial attacks; Defense mechanisms; Activation pattern analysis; CIFAR-10 dataset; Deep learning
Abstract
Convolutional Neural Networks (CNNs) have been at the frontier of the revolution in computer vision. Since the advent of AlexNet in 2012, neural networks with CNN architectures have surpassed human-level performance on many cognitive tasks. As neural networks are integrated into safety-critical applications such as autonomous vehicles, it is essential that they are robust and resilient to errors. Unfortunately, it has recently been observed that deep neural network models are susceptible to adversarial perturbations that are imperceptible to human vision. In this thesis, we propose a solution to defend neural networks against white-box adversarial attacks. The proposed defense is based on activation pattern analysis in the frequency domain. The technique is evaluated and compared with state-of-the-art techniques on the CIFAR-10 dataset.
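The abstract does not detail the defense, but one common frequency-domain signal is that adversarial perturbations concentrate energy in the high-frequency part of an activation map's spectrum. The sketch below is a minimal, hypothetical illustration of that general idea, not the thesis's actual method: `high_freq_energy_ratio` is an assumed helper name, and the smooth-versus-noisy activation maps are synthetic stand-ins for clean and attacked inputs.

```python
import numpy as np

def high_freq_energy_ratio(activations, cutoff=0.5):
    """Fraction of spectral energy above a radial frequency cutoff.

    `activations` is a 2-D array (one channel of a feature map).
    Hypothetical helper -- illustrates frequency-domain activation
    analysis in general, not the defense proposed in the thesis.
    """
    spec = np.fft.fftshift(np.fft.fft2(activations))
    power = np.abs(spec) ** 2
    h, w = activations.shape
    yy, xx = np.ogrid[:h, :w]
    # Radial distance from the spectrum centre, normalised to [0, 1].
    r = np.hypot(yy - h / 2, xx - w / 2) / (np.hypot(h, w) / 2)
    return power[r > cutoff].sum() / power.sum()

# Smooth "clean" activation map versus one with added high-frequency
# noise, mimicking the footprint adversarial perturbations tend to leave.
g = np.sin(np.linspace(0, np.pi, 32))
clean = np.outer(g, g)
rng = np.random.default_rng(0)
perturbed = clean + 0.1 * rng.normal(size=clean.shape)
```

A detector built on such a statistic would flag inputs whose activation spectra show an unusually large high-frequency energy ratio relative to clean data.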
Notes
If this is your thesis or dissertation and you want to learn how to access it, or for more information about readership statistics, contact us at STARS@ucf.edu
Graduation Date
2022
Semester
Summer
Advisor
Ewetz, Rickard
Degree
Master of Science (M.S.)
College
College of Engineering and Computer Science
Department
Computer Science
Degree Program
Computer Science
Identifier
CFE0009258; DP0026862
URL
https://purls.library.ucf.edu/go/DP0026862
Language
English
Release Date
August 2022
Length of Campus-only Access
None
Access Status
Masters Thesis (Open Access)
Subjects
Neural networks (Computer science)--Statistical methods; Neural networks (Computer science)--Research; Neural networks (Computer science)--Design and construction; Computer vision--Research; Deep learning (Machine learning)
STARS Citation
Shah, Sharvil, "Methods For Defending Neural Networks Against Adversarial Attacks" (2022). Electronic Theses and Dissertations, 2020-2023. 1287.
https://stars.library.ucf.edu/etd2020/1287
Accessibility Statement
This item was created or digitized prior to April 24, 2027, or is a reproduction of legacy media created before that date. It is preserved in its original, unmodified state specifically for research, reference, or historical recordkeeping. In accordance with the ADA Title II Final Rule, the University Libraries provides accessible versions of archival materials upon request. To request an accommodation for this item, please submit an accessibility request form.