Keywords

control theory, stabilizability, partial stabilizability, stochastic linear quadratic, linear quadratic regulator, stochastic control theory

Abstract

Optimal Control Theory, a branch of Control Theory, is applicable in fields such as engineering, operations research, and economics. Stochastic Optimal Control deals with noisy systems and data using Itô's formulation. Given a noisy system and a cost functional, the goal is to find a control that minimizes the cost. This thesis focuses on linear quadratic stochastic optimal control, and we explore state equations that are not stabilizable. We first address measurability concerns arising from the semigroup property of the state trajectory. The notions of partial stability and partial stabilizability are introduced, and we formulate the corresponding Lyapunov and Riccati equations. We then develop necessary conditions for open-loop solvability and characterize closed-loop solvability.
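For context, the following is a minimal sketch of the standard stochastic linear quadratic setup the abstract refers to, written in conventional notation; the matrices A, B, C, D, the weights Q, R, G, and the finite horizon [0, T] are generic placeholders and not necessarily the exact formulation used in the thesis.

% Generic stochastic LQ problem (illustrative sketch, not the thesis's exact setup):
% the state X follows a linear Ito SDE driven by a Brownian motion W,
% and the control u minimizes a quadratic cost functional J.
\begin{equation*}
  dX(t) = \bigl[A X(t) + B u(t)\bigr]\,dt
        + \bigl[C X(t) + D u(t)\bigr]\,dW(t),
  \qquad X(0) = x,
\end{equation*}
\begin{equation*}
  J(x; u(\cdot)) = \mathbb{E}\int_0^T \bigl[\langle Q X(t), X(t)\rangle
        + \langle R u(t), u(t)\rangle\bigr]\,dt
        + \mathbb{E}\,\langle G X(T), X(T)\rangle .
\end{equation*}

Here the goal is to choose the control process u(·) so that J is minimized; stabilizability (and the partial stabilizability studied in the thesis) concerns whether such controls can keep the state well behaved over an infinite horizon.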

Thesis Completion Year

2025

Thesis Completion Semester

Spring

Thesis Chair

Yong, Jiongmin

College

College of Sciences

Department

Mathematics

Thesis Discipline

Control Theory

Language

English

Access Status

Open Access

Length of Campus Access

None

Campus Location

Orlando (Main) Campus

Rights Statement

In Copyright