Abstract

Despite the great success of deep learning across a wide variety of tasks, it relies heavily on large amounts of labeled training data, which can be hard to obtain in many real-world scenarios. To address this problem, unsupervised and semi-supervised learning have emerged to take advantage of plentiful, cheap unlabeled data and improve model generalization. In this dissertation, we claim that equivariance and invariance are two critical criteria for robust unsupervised and semi-supervised learning. The idea is as follows: the features of a robust model ought to be sufficiently informative and equivariant to transformations of the input data, while the classifiers should be resilient and invariant to small perturbations on the data manifold and in the model parameters. Specifically, features are learned by auto-encoding the transformations applied to the input data, and models are regularized by minimizing the effects of perturbations on features or model parameters. Experiments on several benchmarks show that the proposed methods outperform many state-of-the-art approaches to unsupervised and semi-supervised learning, demonstrating the importance of the equivariance and invariance rules for robust feature representation learning.
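As a rough illustration of the two criteria, the sketch below shows one way the equivariance and invariance objectives could be instantiated: an encoder embeds an image and its transformed copy, a decoder reconstructs the transformation (here a rotation angle) from the feature pair, and a consistency term penalizes feature changes under small input perturbations. This is a minimal sketch under assumed design choices (the network sizes, the rotation-based transformation, and the noise level are not taken from the dissertation), not the dissertation's exact architecture or losses.

```python
import torch
import torch.nn as nn
import torchvision.transforms.functional as TF


class TransformationAutoEncoder(nn.Module):
    """Sketch: learn equivariant features by auto-encoding transformations.

    A shared encoder maps the original and the transformed image to features;
    a decoder predicts the transformation parameter (a rotation angle) from
    the concatenated feature pair. All sizes are illustrative.
    """

    def __init__(self, feat_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )
        self.decoder = nn.Sequential(          # reconstructs the transformation
            nn.Linear(2 * feat_dim, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x, angle):
        x_t = TF.rotate(x, angle)              # apply the sampled transformation
        z, z_t = self.encoder(x), self.encoder(x_t)
        return self.decoder(torch.cat([z, z_t], dim=1))


model = TransformationAutoEncoder()
x = torch.randn(8, 3, 32, 32)                  # an unlabeled mini-batch
angle = float(torch.empty(1).uniform_(0.0, 360.0))

# Equivariance objective: the features must retain enough information to
# recover the transformation applied to the input.
pred = model(x, angle)
equiv_loss = nn.functional.mse_loss(pred, torch.full_like(pred, angle))

# Invariance objective (sketch): features should change little under a small
# perturbation of the input, approximating resilience on the data manifold.
x_noisy = x + 0.05 * torch.randn_like(x)
inv_loss = nn.functional.mse_loss(model.encoder(x_noisy), model.encoder(x))

(equiv_loss + inv_loss).backward()
```

In this sketch the same encoder serves both objectives: the equivariance term keeps the features informative about the input, while the invariance term regularizes them against nuisance perturbations; the dissertation's methods pursue the same two goals with their own architectures and training procedures.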

Notes

If this is your thesis or dissertation, and you want to learn how to access it or would like more information about readership statistics, contact us at STARS@ucf.edu

Graduation Date

2020

Semester

Spring

Advisor

Wang, Liqiang

Degree

Doctor of Philosophy (Ph.D.)

College

College of Engineering and Computer Science

Department

Computer Science

Degree Program

Computer Science

Format

application/pdf

Identifier

CFE0008065; DP0023204

URL

https://purls.library.ucf.edu/go/DP0023204

Language

English

Release Date

May 2020

Length of Campus-only Access

None

Access Status

Doctoral Dissertation (Open Access)
