Keywords
Natural energy, Markov decision process, MDP, load shedding, energy storage, intermittency, expected cost
Abstract
In modern power systems, renewable energy has become an increasingly popular form of generation as rules and regulations aimed at achieving clean energy are adopted worldwide. However, clean energy has drawbacks; wind energy, for example, introduces intermittency. In this thesis, we discuss a method for dealing with this intermittency: by shedding a specific amount of load, we can avoid a total breakdown of the power system. The load-shedding method discussed in this thesis uses a Markov Decision Process (MDP) solved by backward policy iteration. This probabilistic approach chooses the load-shedding path that minimizes the expected total cost while ensuring no power failure. We compare our results with two control policies, a load-balancing policy and a less-load-shedding policy. It is shown that the proposed MDP policy outperforms the other control policies and achieves the minimum expected total cost.
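To make the idea of solving a finite-horizon load-shedding MDP by backward induction concrete, the following is a minimal sketch, not the thesis's actual model: the storage levels, wind distribution, demand, costs, and horizon are all illustrative assumptions, and the dynamics are a simplified stand-in for the formulation developed in the thesis. It shows how a backward sweep over stages yields, for each storage state, the shedding action that minimizes the expected total cost (shedding cost plus a failure penalty).

```python
# Illustrative sketch only: a toy finite-horizon load-shedding MDP solved by
# backward induction. All parameters below are assumptions for demonstration.

T = 4                              # planning horizon in stages (assumed)
STORAGE_LEVELS = range(0, 5)       # discretized energy-storage levels (assumed)
SHED_ACTIONS = [0, 1, 2]           # units of load that may be shed (assumed)
WIND = {0: 0.4, 1: 0.4, 2: 0.2}    # assumed wind-generation distribution
DEMAND = 3                         # fixed demand per stage (assumed)
SHED_COST = 1.0                    # cost per unit of shed load (assumed)
FAILURE_COST = 50.0                # penalty when residual demand is unmet (assumed)

def stage_cost_and_next(s, a, w):
    """Cost incurred and next storage level given storage s, shed a, wind w."""
    supply = s + w                 # stored energy plus wind generation
    need = DEMAND - a              # residual demand after shedding
    if supply < need:              # unmet demand -> power failure penalty
        return a * SHED_COST + FAILURE_COST, 0
    next_s = min(max(STORAGE_LEVELS), supply - need)   # surplus goes to storage
    return a * SHED_COST, next_s

# Backward induction: V[t][s] is the minimal expected cost-to-go from stage t, state s.
V = [{s: 0.0 for s in STORAGE_LEVELS} for _ in range(T + 1)]
policy = [{} for _ in range(T)]
for t in reversed(range(T)):
    for s in STORAGE_LEVELS:
        best_a, best_q = None, float("inf")
        for a in SHED_ACTIONS:
            q = 0.0
            for w, p in WIND.items():              # expectation over wind outcomes
                cost, ns = stage_cost_and_next(s, a, w)
                q += p * (cost + V[t + 1][ns])
            if q < best_q:
                best_a, best_q = a, q
        policy[t][s], V[t][s] = best_a, best_q

print("Expected total cost from full storage:", V[0][max(STORAGE_LEVELS)])
print("Stage-0 shedding decision per storage level:", policy[0])
```

Running the sketch prints the optimal expected cost and the first-stage shedding decision for each storage level; the baseline policies mentioned in the abstract could be compared by evaluating their expected cost under the same dynamics.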
Graduation Date
2015
Semester
Spring
Advisor
Atia, George
Degree
Master of Science in Electrical Engineering (M.S.E.E.)
College
College of Engineering and Computer Science
Department
Electrical Engineering and Computer Science
Degree Program
Electrical Engineering
Format
application/pdf
Identifier
CFE0005635
URL
http://purl.fcla.edu/fcla/etd/CFE0005635
Language
English
Release Date
May 2015
Length of Campus-only Access
None
Access Status
Masters Thesis (Open Access)
STARS Citation
Jimenez, Edwards, "Resource allocation and load-shedding policies based on Markov decision processes for renewable energy generation and storage" (2015). Electronic Theses and Dissertations. 1140.
https://stars.library.ucf.edu/etd/1140