Faculty Advisor

Dr. Daniel Stephens

Keywords

Digital Redlining, Smart Cities, Artificial Intelligence, Housing Inequality, Fair Housing Act, Algorithmic Governance, Civil Rights Law, Urban Studies

Abstract

Artificial intelligence is increasingly used in urban housing systems, where it shapes decisions about tenant screening, rent pricing, lending, zoning, and neighborhood investment. Although these tools are often promoted as efficient and impartial, they frequently rely on historical data that reflect racial, economic, and spatial inequality. As a result, AI systems can reproduce discriminatory outcomes even when protected characteristics are not directly used. This paper examines digital redlining in the smart city and argues that algorithmic housing tools mirror long-standing structural inequities, raising significant concerns under fair housing and civil rights law. It evaluates how automated screening, predictive analytics, and rent optimization can restrict access to housing while reducing transparency and limiting opportunities for appeal. The analysis shows that without stronger oversight and clearer accountability, AI-driven systems risk normalizing discriminatory practices under the appearance of neutrality. The paper concludes by calling for legal and policy reforms that center fairness, accountability, and equitable access in the governance of housing technologies.

Date Added

2026

Rights

This work is licensed under a Creative Commons Attribution 4.0 International License.

College

College of Sciences
