Algorithmic Bias and the New Face of Housing Discrimination
By Eric Lawrence Frazier, MBA
Every person who writes code brings their whole life to every line. Their education, their experience, their assumptions, their blind spots, and their cultural reality are present in every decision about what to measure, what to optimize, and what to treat as a proxy for creditworthiness, reliability, or risk. To claim that the systems they build are colorblind is not ignorance. It is politics. It is the deliberate shedding of accountability for outcomes that were foreseeable from the moment the training data was assembled.
The automated underwriting systems that govern mortgage approval decisions — Fannie Mae’s Desktop Underwriter, Freddie Mac’s Loan Product Advisor — are trained on historical loan performance data. That history is the history of American mortgage lending. The same history that produced redlining, blockbusting, racially restrictive covenants, the GI Bill exclusion, and the systematic steering of Black borrowers into subprime products when they qualified for prime. The system learns from that record. It learns which borrower profiles historically produced defaults. And those profiles were shaped, in large part, not by the financial characteristics of the borrowers but by their exclusion from the products and opportunities that build the credit records the system is designed to reward.
The data always reveals what the language conceals. The homeownership gap is 29 percentage points wide. The wealth gap has widened. The Fair Housing Act is on the books. The automated underwriting systems are certified neutral. And the outcomes are as unequal as they were before any of these systems existed. That is not coincidence. That is the system working exactly as designed — with the discrimination embedded in the inputs rather than announced in the outputs.
What the Facebook Case Documented
In 2016, the National Fair Housing Alliance and investigative journalists at ProPublica began documenting something that the technology industry insisted was impossible: that Facebook’s advertising algorithm was steering housing advertisements away from Black and Hispanic users without any explicit instruction to do so. The mechanism was the algorithm’s machine learning system, which optimized ad delivery based on predicted engagement. It used behavioral data — what users clicked on, how long they spent reading certain content, what they engaged with — to predict which users within a targeted audience were most likely to respond to an ad.
That behavioral data was not racially neutral. It was shaped by decades of segregated experience. Black users in communities that had been systematically excluded from homeownership information, from banking relationships, from real estate marketing, exhibited different behavioral patterns around housing content than white users in communities where that information had always flowed freely. The algorithm read those patterns as signals of likely engagement and steered accordingly. Housing ads went to white users. Black and Hispanic users, who may have been equally or more interested in buying a home, did not receive them.
No one at Facebook wrote a line of code that said to discriminate by race. The discrimination was not in the instruction. It was in the training data, which carried the history of American segregation inside it, and in the optimization function, which treated the patterns produced by that history as signals to be amplified rather than artifacts to be corrected.
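The mechanism is easy to see in miniature. The following is an illustrative toy model, not Facebook's actual system: two groups of equally interested users, where one group's historical click rate on housing content is lower only because housing ads rarely reached them. The numbers and group labels are my assumptions for the sketch.

```python
# Toy sketch (illustrative assumptions, not the real system): delivery that
# optimizes purely on predicted engagement can skew along group lines when
# historical exposure, not interest, shaped the engagement data.
import random

random.seed(0)

# Two groups of equally interested users. Group B's historical click rate on
# housing content is lower only because housing ads rarely reached them.
users = [{"group": "A", "hist_clicks": random.gauss(0.30, 0.05)} for _ in range(500)] + \
        [{"group": "B", "hist_clicks": random.gauss(0.10, 0.05)} for _ in range(500)]

def predicted_engagement(user):
    # The model never sees the group label; it only sees past behavior.
    return user["hist_clicks"]

# Deliver the ad to the 500 users with the highest predicted engagement.
recipients = sorted(users, key=predicted_engagement, reverse=True)[:500]
share_b = sum(u["group"] == "B" for u in recipients) / len(recipients)
print(f"Share of group B among ad recipients: {share_b:.0%}")
```

No variable named race appears anywhere, yet group B is nearly absent from the audience, because the optimizer amplifies a pattern that exclusion created.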
HUD filed a charge. The settlement came in 2022. Meta agreed to overhaul its ad delivery system for housing, employment, and credit categories and to submit to regular audits. The settlement did not eliminate the problem. It documented it, created a compliance framework, and left the underlying architecture in place. The researchers who have continued testing these systems have found that algorithmic discrimination in digital housing advertising persists across multiple platforms.
The FinTech Evidence
The Facebook case was about who receives information about housing. A parallel body of research documents what happens when Black and Hispanic borrowers actually apply for credit. A 2022 study published in the Journal of Financial Economics analyzed 9 million mortgage records and found that FinTech lenders — the algorithmic, online-first platforms positioned as the bias-free alternative to traditional banking — charged Black and Hispanic borrowers approximately 8 basis points more than similarly qualified white borrowers. That differential generated an estimated $765 million in excess interest payments annually. The systems were neutral. The outcomes were not.
The National Fair Housing Alliance and its member organizations have been testing these systems systematically — creating matched pairs of testers, documenting differential treatment, filing complaints, and pursuing settlements. The cases they win matter. The deterrent effect of the cases they document matters. But the scale of algorithmic decision-making in 2026 — the millions of lending decisions, advertising impressions, and search results processed every day by systems that carry the history of American discrimination inside them — exceeds what any enforcement organization can match case by case.
Your Practical Answer in 2026
The first-generation Black homebuyer navigating this landscape needs to understand the environment clearly. The bias is real. It is embedded in systems that will not announce themselves as biased. It will show up as a denial that comes with no explanation, an interest rate that is slightly higher than a white colleague with an identical profile received, a search result that consistently steers you toward certain neighborhoods and away from others, and an ad for a down payment assistance program that never reaches you because the algorithm decided you were not the target audience.
Work with people you know, trust, and like — people who look like you and understand your experience. The loan officer who has navigated these systems themselves, who has helped clients through the resistance, who knows where the walls are and how to get around them. The real estate professional who has worked in the communities you are trying to buy into and understands the informal dynamics of how listings move and how offers are received. Your community is your infrastructure. Build within it first.
There are three kinds of people who face resistance. The first sees the wall and turns around. The second fights with courage, runs out of energy, and gives up. The third refuses to stop. For them, it is death before dishonor. They will not allow the people who do not want to see them succeed to have the victory. Those people are our ancestors. They endured violence, theft, systematic exclusion, and every tactic that a society organized around their subordination could devise. And they still found a way to buy land, build businesses, educate their children, and leave something behind.
The algorithm is a wall. It is not the first wall. It is not the highest wall. Prepare your file so thoroughly that no automated system has a legitimate basis for denial. Credit above the threshold. Debt eliminated. Income documented. Down payment sourced and seasoned. Then find the professional who knows how to present that file and who will fight for you when the system pushes back. The bias is in the system. The determination is in you. Take your best shot, America — we can handle it.
Poetry says the rest.
We Achieve in Spite Of
The person at the keyboard brings their whole life to the line.
Their knowledge and their limits and their culture by design.
To call the system colorblind is politics, not fact.
It sheds the accountability for every outcome tracked.
The gap is in the data, whether anyone admits.
The homeownership numbers and the wealth gap never fits.
The law is on the books, and yet the results tell a different tale.
We have the system and the law — but equity is still for sale.
The system is not colorblind — the data makes it plain.
They built it on a history, and history has a name.
The ancestors endured far worse, and still they found the way.
We achieve in spite of everything. We build. We stay.
They built a system learning from the patterns people made.
The patterns came from segregation’s decades-long blockade.
No one wrote a line of code that said exclude by race.
The algorithm found the history and put it into place.
The ad went to the white user and away from those who matched.
The test was run and documented, and the settlement was attached.
Three hundred thirty-five million showed disparate impacts’ reach.
The bias hid inside the code, but outcomes broke the breach.
The system is not colorblind — the data makes it plain.
They built it on a history, and history has a name.
The ancestors endured far worse, and still they found the way.
We achieve in spite of everything. We build. We stay.
Find the loan officer who looks like you and knows the road.
Who has navigated every wall and carried that same load.
Who understands the resistance you will face inside the gate.
Work with those who know your story — that is how you navigate.
There are those who see the wall and turn and walk away in fear.
There are those who fight with courage then run out of will to persevere.
And then there are the ones who will not stop for any cost.
Death before dishonor — and the victory is not lost.
The system is not colorblind — the data makes it plain.
They built it on a history and history has a name.
The ancestors endured far worse and still they found the way.
We achieve in spite of everything. We build. We stay.
They endured the violence, the threat, the theft, the chain.
They built on ground that was not theirs and built it up again.
Every wall that stood against them only proved the point —
That the spirit cannot be contained and will not disappoint.
So bring your bias and your algorithm and your code.
Bring your steering and your redlining and your digital abode.
The determined ones among us will not flinch and will not flee.
Take your best shot America — we were built for this. We’re free.
The system is not colorblind — the data makes it plain.
They built it on a history, and history has a name.
The ancestors endured far worse, and still they found the way.
We achieve in spite of everything. We build. We stay.
They hid the bias in the code and called the system clean.
The data told the truth about the places in between.
You cannot code away a history that still has not healed.
Take your best shot America — we will not yield.
The system is not colorblind — the data makes it plain.
They built it on a history, and history has a name.
The ancestors endured far worse, and still they found the way.
We achieve in spite of everything. We build. We stay.
So know the bias, name the system, document the harm.
Then find your people, build your file, and meet them with no alarm.
The spirit of the ancestors runs deep in everything we do.
Take your best shot. We have handled worse. We will handle this too.
The system is not colorblind — the data makes it plain.
They built it on a history, and history has a name.
The ancestors endured far worse, and still they found the way.
We achieve in spite of everything. We build. We stay.
CONTINUE THE CONVERSATION
Watch the 2026 Fair Housing Series
Airing now on The Power Is Now TV Network — Roku, Fire TV, Apple TV, and ericfrazieruk.com.
Attend the Real Estate Seminar — April 26
Eric Lawrence Frazier, MBA, is hosting a live real estate seminar on April 26. In person at Coldcutz Barbershop in Riverside, California, or join us online. Details and registration at EricFrazier.com.
Become a Member
Membership is free to start. Full access to the series, the magazine, free books, and the community at ericfrazieruk.com.
Get the Books
The Credit Handbook and How to Run Your Household Like a Business at EricFrazier.com/bookstore.
The Power Is Now Real Estate News — 24/7
Available around the clock on The Power Is Now TV Network and at ericfrazieruk.com.
Eric Lawrence Frazier, MBA
Your trusted advisor in business and wealth
www.ericfrazier.com | www.ericfrazieruk.com
NMLS #451807 | CA DRE #01143484
Schedule a consultation: https://calendly.com/ericfrazier/real-estate-mortgage-consultation-clients
References for this essay are available at thepowerisnow.com/fairhousing2026