Keywords

control theory, stabilizability, partial stabilizability, stochastic linear quadratic, linear quadratic regulator, stochastic control theory

Abstract

Optimal Control Theory, a branch of Control Theory, is applicable in fields such as engineering, operations research, and economics. Stochastic Optimal Control deals with systems perturbed by noise, modeled using Itô calculus. Given a noisy state equation and a cost functional, the goal is to find a control that minimizes the cost. This thesis focuses on linear quadratic stochastic optimal control, exploring state equations that are not stabilizable. We first address measurability concerns arising from the semigroup property of the state trajectory. The notions of partial stability and partial stabilizability are introduced, and their corresponding Lyapunov and Riccati equations are formulated. We then develop necessary conditions for open-loop solvability and characterize closed-loop solvability.
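For orientation, the standard finite-horizon stochastic linear quadratic problem referenced in the abstract is typically written as follows. This is a conventional textbook formulation given here as a sketch; the symbols (A, B, C, D, Q, R, G) are the usual ones and do not necessarily match the thesis's notation.

```latex
% State equation driven by a one-dimensional Brownian motion W(t):
\begin{aligned}
dX(t) &= \bigl[A X(t) + B u(t)\bigr]\,dt
        + \bigl[C X(t) + D u(t)\bigr]\,dW(t),
        \qquad X(0) = x,\\[4pt]
% Quadratic cost functional to be minimized over admissible controls u:
J(x; u) &= \mathbb{E}\int_0^T \Bigl(\langle Q X(t), X(t)\rangle
        + \langle R u(t), u(t)\rangle\Bigr)\,dt
        + \mathbb{E}\,\langle G X(T), X(T)\rangle.
\end{aligned}
```

When the state equation is stabilizable, the infinite-horizon version of this problem is resolved through an algebraic Riccati equation; the thesis studies what remains possible when stabilizability fails.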

Thesis Completion Year

2025

Thesis Completion Semester

Spring

Thesis Chair

Yong, Jiongmin

College

College of Sciences

Department

Mathematics

Thesis Discipline

Control Theory

Language

English

Access Status

Open Access

Length of Campus Access

None

Campus Location

Orlando (Main) Campus

Subjects

Stochastic control theory; Stochastic systems; Automatic control--Mathematics; Feedback control systems--Stability; Control theory--Mathematical models


Rights Statement

In Copyright