Keywords

Genetic Algorithms, Machine Learning, AI, Finance

Abstract

In the natural world, the earliest form of learning and optimization emerged through evolution, with species adapting only across generations rather than within a single lifespan. Over time, the development of complex brains allowed individuals to learn directly from their environments, accelerating adaptation. Finally, the advent of language enabled knowledge to be exchanged among individuals, vastly increasing collective intelligence. In today’s computational landscape, this evolutionary trajectory is mirrored by the rise of neural networks and, more recently, Transformer architectures, which capture increasingly sophisticated patterns from data. Yet even these advanced models benefit from higher-level optimization to fine-tune hyperparameters and adapt to dynamic problem spaces. This thesis introduces a Genetic Algorithm (GA) framework positioned at the top of this “evolutionary pyramid,” leveraging population-based search to optimize neural networks and Transformers alike. By integrating the classic genetic operators of selection, crossover, and mutation with a novel dynamic population scaling mechanism, the GA efficiently navigates non-stationary environments that would otherwise challenge static approaches. This is especially pertinent in quantitative finance, where market volatility demands continual hyperparameter tuning. Through a bitstring-based representation, the framework achieves fine-grained control over critical parameters, enabling exploration of a vast configuration space without incurring prohibitive computational costs. The synergy between a GA-driven evolutionary layer and the “brains” of neural networks yields robust performance gains, evidenced by empirical results demonstrating improved accuracy and reduced overhead compared with traditional tuning methods.

Overall, this evolutionary optimizer underscores the importance of adaptive strategies that can scale and respond to real-time feedback, bridging the gap between fixed hyperparameter settings and the dynamic demands of complex tasks. By situating itself as the top layer of an evolutionary hierarchy, the proposed GA framework paves the way for more flexible, responsive, and efficient AI systems, enhancing theoretical understanding and practical outcomes in domains ranging from finance to broader machine learning applications. Ultimately, this work highlights the transformative potential of evolutionary optimization in shaping the future of adaptive, self-improving AI.
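
The core mechanism described above (hyperparameters encoded as bitstrings and evolved through selection, crossover, mutation, and a population that grows or shrinks with search progress) can be illustrated with a minimal Python sketch. This is not the thesis's actual implementation: the decoding ranges, the stagnation-based scaling rule, and the fitness function are all assumptions chosen for illustration.

    # Minimal sketch of a bitstring GA with dynamic population scaling.
    # Decoding ranges, scaling rule, and fitness are illustrative assumptions.
    import random

    BITS_PER_GENE = 8          # resolution of each encoded hyperparameter
    N_GENES = 3                # e.g. learning rate, hidden size, dropout

    def random_genome():
        return [random.randint(0, 1) for _ in range(BITS_PER_GENE * N_GENES)]

    def decode(genome, lows=(1e-5, 16, 0.0), highs=(1e-1, 512, 0.9)):
        """Map each 8-bit gene onto an assumed real-valued parameter range."""
        params = []
        for i in range(N_GENES):
            bits = genome[i * BITS_PER_GENE:(i + 1) * BITS_PER_GENE]
            frac = int("".join(map(str, bits)), 2) / (2 ** BITS_PER_GENE - 1)
            params.append(lows[i] + frac * (highs[i] - lows[i]))
        return params

    def crossover(a, b):
        point = random.randrange(1, len(a))   # single-point crossover
        return a[:point] + b[point:]

    def mutate(genome, rate=0.01):
        return [bit ^ 1 if random.random() < rate else bit for bit in genome]

    def evolve(fitness, generations=50, pop_size=20, min_pop=10, max_pop=80):
        population = [random_genome() for _ in range(pop_size)]
        best_prev = float("-inf")
        for _ in range(generations):
            scored = sorted(population, key=fitness, reverse=True)
            best = fitness(scored[0])
            # Assumed scaling rule: grow the population when progress
            # stalls to widen exploration, shrink it when fitness improves
            # to save evaluations.
            if best <= best_prev:
                pop_size = min(max_pop, pop_size + 5)
            else:
                pop_size = max(min_pop, pop_size - 2)
            best_prev = best
            parents = scored[:max(2, pop_size // 2)]
            population = scored[:2] + [        # elitism: keep the top two
                mutate(crossover(*random.sample(parents, 2)))
                for _ in range(pop_size - 2)
            ]
        return decode(max(population, key=fitness))

A caller would supply the fitness function, for example one that trains a small model with the decoded hyperparameters and returns the negated validation loss, so that higher fitness corresponds to better generalization.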

Completion Date

2025

Semester

Spring

Committee Chair

Borowczak, Mike

College

College of Engineering and Computer Science

Identifier

DP0029267

Document Type

Dissertation/Thesis
