Algorithms, Platforms, and Privacy Risks

In today's fast-moving digital world, nearly every aspect of finding a home (renting, buying, applying for a mortgage) happens online. While this shift brings convenience, it also introduces a new set of risks, especially for communities historically excluded from housing opportunities. Artificial intelligence, automation, and big data are reshaping the housing landscape, but not always for the better.
Algorithms used in housing-related decisions may unintentionally reinforce racial and economic disparities. From how ads are targeted on social media to how tenant applications are scored, the tech that promises neutrality can sometimes magnify bias. The digital tools we use daily can sort, exclude, and prioritize people in ways that feel invisible and unfair.
In this article, we'll explore how technology intersects with fair housing. We'll examine AI in tenant screening, discriminatory ad targeting on platforms like Facebook, the challenges of mortgage tech, and the broader risks and opportunities that come with digital housing tools. Understanding these dynamics is essential for protecting civil rights in the age of algorithms.
Tenant Screening and AI Bias
Many landlords now use third-party services that automate tenant screening, relying on machine learning models to assess applicant "risk." These systems crunch data on credit, rental history, evictions, and even employment, spitting out scores that determine who gets in and who doesn't.
But what happens when the data used to train these systems is already biased?
Take, for example, Black and Hispanic households, which are disproportionately impacted by lower credit scores due to historical economic exclusion. When an AI system places heavy weight on credit, it can perpetuate that disparity, even if there's no explicit intent to discriminate.
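To make that mechanism concrete, here is a minimal, purely illustrative sketch. The weights, threshold, and credit scores below are synthetic and hypothetical, not any vendor's actual model; the point is only that a credit-heavy score can produce very different approval rates across groups whose credit distributions differ, even though group membership is never an input:

```python
# Illustrative only: toy screening score with a heavy credit weight.
# All numbers (weights, threshold, scores) are hypothetical.

def screen(credit_score, eviction_count, weight_credit=0.8):
    """Toy applicant score: heavily weighted credit, small eviction penalty."""
    credit_part = credit_score / 850                      # normalize to 0..1
    history_part = max(0.0, 1.0 - 0.5 * eviction_count)   # 1.0 if no evictions
    return weight_credit * credit_part + (1 - weight_credit) * history_part

# Two synthetic groups whose credit distributions differ for historical
# reasons; thresholding the score turns that gap into an approval gap.
group_a = [720, 700, 680, 750, 710]   # hypothetical credit scores
group_b = [640, 620, 660, 600, 650]   # hypothetical credit scores

def approve(scores, threshold=0.81):
    return sum(screen(c, 0) >= threshold for c in scores) / len(scores)

print(approve(group_a), approve(group_b))   # → 1.0 0.4
```

Race never appears in the model, yet the approval-rate gap tracks the historical credit gap exactly; that is the disparity-laundering effect regulators worry about.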
In 2023, the Federal Trade Commission (FTC) issued warnings to landlords and screening companies that the use of AI must still comply with the Fair Credit Reporting Act (FCRA). This includes ensuring accuracy and transparency, and giving tenants the right to dispute inaccuracies.
A major concern is the lack of visibility: tenants often don't know what factors were used against them, nor how to correct potential errors. The screening system becomes a "black box" that silently closes doors.
Reform advocates suggest requiring companies to provide clearer disclosures, third-party audits of algorithmic tools, and human review of automated denials. Because when automation lacks accountability, it threatens access to housing for the very people fair housing laws were designed to protect.
Facebook's Ad-Targeting Lawsuit Settlement
Social media platforms play a powerful role in how people see housing listings. But in 2019, it was revealed that Facebook's ad tools allowed real estate companies and landlords to exclude viewers based on race, gender, religion, and other protected characteristics.
HUD sued Facebook, calling the practice digital redlining. While traditional redlining involved physical maps and banks, today's redlining can occur with a few clicks in an ad-targeting dashboard.
Facebook eventually settled the lawsuit and agreed to overhaul how housing ads are displayed. The platform created a new system, called the Special Ad Category, that restricts targeting for housing, credit, and job ads to prevent discrimination.
This case was a turning point. It exposed how seemingly neutral digital tools can be weaponized to maintain segregation. And it showed that civil rights enforcement must evolve to meet the realities of the digital age.
But many believe more is needed. Platforms like Google and Instagram must also be scrutinized, and federal regulators should proactively monitor how ads are delivered, not just how they're set up.
Mortgage Tech and Fair Lending Audits
Digital underwriting is rapidly transforming the mortgage industry. Today, lenders use AI models to assess borrower risk and streamline approvals. But just like tenant screening tools, mortgage tech can bake in bias if it relies on flawed assumptions or skewed data.
Algorithms might overvalue factors correlated with wealth, like large savings accounts or long credit histories, while undervaluing rental payment history or gig economy income, which are more common among younger, minority, or lower-income applicants.
In 2021, the Consumer Financial Protection Bureau (CFPB) raised concerns about algorithmic discrimination in lending, noting that the lack of transparency in automated systems makes it harder to detect whether fair lending laws are being violated.
This is where fair lending audits come in. Lenders must test whether their models produce disparate impacts across protected classes, and must correct them if they do. However, many fintech companies are not fully transparent, citing proprietary models.
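One commonly cited screen in disparate-impact testing is the adverse impact ratio with the "four-fifths rule" borrowed from employment-discrimination guidelines: flag the model when a group's approval rate falls below 80% of the most-favored group's rate. Here is a minimal sketch of that check, using hypothetical approval counts (not data from any real lender):

```python
# Hedged sketch of a four-fifths-rule screen; all counts are hypothetical.

def adverse_impact_ratio(approvals_by_group):
    """approvals_by_group: {group: (approved, total)} -> {group: AIR},
    where AIR is each group's approval rate divided by the best rate."""
    rates = {g: approved / total for g, (approved, total) in approvals_by_group.items()}
    best = max(rates.values())
    return {g: rate / best for g, rate in rates.items()}

audit = adverse_impact_ratio({
    "group_x": (450, 500),   # 90% approval rate (hypothetical)
    "group_y": (310, 500),   # 62% approval rate (hypothetical)
})

# Four-fifths rule: an AIR below 0.8 is a red flag for disparate impact.
flagged = [g for g, air in audit.items() if air < 0.8]
print(flagged)   # → ['group_y']
```

A flag like this is not proof of illegal discrimination; it is a trigger for deeper review of the model's inputs and for the correction step the audits are meant to enforce.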
The push for "explainable AI" is gaining traction, meaning companies must be able to explain how their decisions are made. Because if you're denied a mortgage, you should know why.
Risks and Opportunities in Digital Housing Tools
Technology can be a double-edged sword, especially in housing. Here are some of the biggest risks and opportunities:

Risks
- Privacy Invasion: Renters may be evaluated based on data they didn't even know was being used, such as online behavior or geolocation data.
- Algorithmic Opacity: Companies often guard their algorithms as trade secrets, making it difficult to challenge decisions.
- Unequal Access: Not everyone has reliable internet or digital literacy, especially older adults or low-income families.

Opportunities
- Faster Access: Online applications and digital documents reduce paperwork and speed up approvals.
- Wider Reach: Listings on Zillow, Apartments.com, and others increase visibility and choice.
- Proactive Monitoring: AI can also detect discriminatory patterns if designed to do so, turning tech into a watchdog, not just a gatekeeper.

To maximize the benefits and reduce the harm, tech companies must partner with civil rights organizations, regulators, and consumers. Transparency and accountability are the cornerstones of fairness in the digital age.

Conclusion

Technology holds great promise for streamlining and expanding access to housing, but only if it's designed and monitored with fairness in mind. From biased algorithms to opaque advertising systems, we've seen that digital tools can reproduce the very inequalities we've fought to dismantle.

We must recognize that discrimination today doesn't always look like a slammed door or a "No Vacancy" sign. Sometimes, it looks like a missing ad, a hidden score, or a rejected application with no explanation.

As consumers, advocates, and professionals, it's our responsibility to question how these tools are used, and to push for systems that treat everyone with fairness and dignity.

Stay Connected & Take Action:
Subscribe to this newsletter on LinkedIn, or follow the blog at:
www.ericfrazier.com or www.ericfrazieruk.com
Watch our interviews and updates on YouTube:
youtube.com/thepowerisnow
Need personalized advice or consultation? Whether you're buying, selling, or building your business, I'm here to help.
Schedule your free discovery call today: https://calendly.com/ericfrazier/real-estate-mortgage-consultation-clients
Your trusted advisor in business and wealth.
- Eric Lawrence Frazier, MBA
APA References:
- Federal Trade Commission. (2018, October 16). Texas company will pay $3 million to settle FTC charges that it failed to meet accuracy requirements for its tenant screening reports. Retrieved from https://www.ftc.gov/news-events/news/press-releases/2018/10/texas-company-will-pay-3-million-settle-ftc-charges-it-failed-meet-accuracy-requirements-its-tenant
- U.S. Department of Housing and Urban Development. (2019, March 28). HUD charges Facebook with housing discrimination over company's targeted advertising practices. Retrieved from https://www.hud.gov/press/press_releases_media_advisories/HUD_No_19_035
- Federal Trade Commission. (2023, February 28). FTC warns landlords using AI rental screening may violate fair credit laws. Retrieved from https://www.ftc.gov/news-events/news/press-releases/2023/02/ftc-warns-landlords-using-ai-rental-screening-may-violate-fair-credit-laws
- Axios. (2022, June 21). Meta to rework housing ad system under DOJ discrimination settlement. Retrieved from https://www.axios.com/2022/06/21/meta-doj-housing-ads-discrimination-settlement
- Federal Trade Commission. (2023, March). Privacy and data security update. Retrieved from https://www.ftc.gov/system/files/ftc_gov/pdf/2024.03.21-PrivacyandDataSecurityUpdate-508.pdf