The Fair Housing Act of 1968, as amended (together with HUD's 2016 disparate-impact guidance), prohibits housing discrimination based on race, color, religion, sex, disability, familial status, and national origin.
For a real estate platform, this isn't just a moral commitment — it's a structural design constraint.
What our platform blocks (and why)
- No familial-status filter — listings cannot be filtered by "no children" or similar criteria. Familial-status discrimination is illegal.
- No school-only filter — school quality is a legitimate concern, but filtering on school scores alone can disproportionately affect protected classes (see HUD's 2016 disparate-impact guidance). Our platform requires supporting geographic context (state, city, or polygon) before applying school filters.
- No ZIP-only filter — historical ZIP-code boundaries can act as proxies for race (a legacy of redlining). The same supporting-context rule applies.
- No "religious community" search — terms like "Christian community" or "near a temple" trigger our steering-language scanner before a listing is published.
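The supporting-context rule above can be sketched as a pre-query validation step. This is a minimal illustration, not the platform's actual API; `validate_filters` and all field names are hypothetical.

```python
# Hypothetical filter-validation sketch for the rules described above.
BLOCKED_FILTERS = {"familial_status"}          # never allowed: FHA protected class
CONTEXT_REQUIRED = {"school_score", "zip"}     # allowed only with geographic context
GEO_CONTEXT_KEYS = {"state", "city", "polygon"}

def validate_filters(filters: dict) -> list[str]:
    """Return a list of violations; an empty list means the query may run."""
    violations = []
    has_geo = any(k in filters for k in GEO_CONTEXT_KEYS)
    for key in filters:
        if key in BLOCKED_FILTERS:
            violations.append(f"filter '{key}' is never permitted")
        elif key in CONTEXT_REQUIRED and not has_geo:
            violations.append(f"filter '{key}' requires state/city/polygon context")
    return violations
```

A school-score query with a city attached passes; the same query with no geographic context is rejected before it reaches the search index.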
Why this matters for you
- You'll never see a listing that uses steering language ("perfect for families", "exclusive neighborhood").
- Our search algorithm doesn't surface listings differently based on guesses about who you are — we perform no race, gender, or age inference.
- Lead routing logs every assignment to detect disparate-impact patterns.
The audit trail
Every search query, every listing publication, and every showing request is logged to compliance_event_log — an append-only, hash-chained log exported nightly to WORM (write-once, read-many) storage.
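The append-only, hash-chained structure can be sketched as follows: each entry stores the previous entry's hash, so altering any past event breaks every hash after it. This is a minimal in-memory illustration, assuming SHA-256 and JSON payloads; the field names are not compliance_event_log's real schema.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first entry in the chain

def append_event(log: list[dict], event: dict) -> dict:
    """Append an event, chaining its hash to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else GENESIS
    payload = json.dumps(event, sort_keys=True)
    entry = {
        "event": event,
        "prev_hash": prev_hash,
        "hash": hashlib.sha256((prev_hash + payload).encode()).hexdigest(),
    }
    log.append(entry)
    return entry

def verify_chain(log: list[dict]) -> bool:
    """Recompute every hash; any tampering breaks the chain."""
    prev = GENESIS
    for entry in log:
        payload = json.dumps(entry["event"], sort_keys=True)
        if entry["prev_hash"] != prev:
            return False
        if entry["hash"] != hashlib.sha256((prev + payload).encode()).hexdigest():
            return False
        prev = entry["hash"]
    return True
```

Exporting such a chain nightly to write-once storage means even a compromised application database cannot quietly rewrite history: the WORM copy's hashes would no longer match.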
If a Fair Housing complaint arises (a state attorney general or HUD investigation), we can reconstruct exactly which filters, listings, and routing decisions affected each user. This is your protection too.