The Banker Who Knew Your Name: How Mortgage Approval Went From a Handshake to an Algorithm
The Man in the Suit Who Decided Your Future
Walk into a bank in 1965, and you'd find yourself in a wood-paneled office across from a loan officer—typically a man in his fifties who'd been in the business for thirty years. He held your financial destiny in his hands, and he made decisions the way people made most decisions then: by feel, by instinct, by what he thought he knew about you.
He'd read your application. He'd ask questions. Where did you work? How stable was that job? What did your wife do? (Yes, he'd ask about your wife, even if you were the one applying.) Were you the type of person who paid your bills on time? He'd check with your employer. He'd call your references. Sometimes he'd even drive by your current house to see if you kept up the yard.
Then he'd decide. Not based on a score or a formula, but on his judgment. His gut. Whether he thought you were the kind of person who'd pay back a loan.
It worked, in the sense that mortgages got approved and most people paid them back. But it also worked in ways we've largely tried to forget. That loan officer's gut feeling had a name: discrimination.
When "Fit" Meant Something Darker
The mortgage approval process of the mid-twentieth century was a masterclass in gatekeeping dressed up as prudence. Loan officers could—and did—deny mortgages to Black families trying to move into white neighborhoods. They rejected single women, assuming they'd quit work to have babies. They turned away immigrants, Catholics, and Jews based on stereotypes about financial reliability that had nothing to do with actual credit risk.
The system wasn't transparent. There was no appeals process. You didn't get a score; you got a "no." And often, you never really knew why.
Worse, these decisions shaped America's geography. Entire neighborhoods were redlined—literally marked on maps as too risky for mortgage lending. The wealth those mortgages would have built never materialized for the families locked out. The segregation redlining enforced lasted for generations.
The loan officer's "feel for people" was really a feel for people who looked and lived like him.
The Arrival of the Score
By the 1980s and 1990s, the mortgage industry started to change. Credit scores emerged—first as a curiosity, then as the standard. Underwriting became automated. Software could now do what the loan officer did, but faster, more consistently, and theoretically without bias.
Today, you can apply for a mortgage online. Within hours, sometimes within minutes, you get a decision. No conversation. No handshake. No loan officer who's known your family for twenty years. No loan officer who's never heard of your neighborhood and assumes the worst.
The process is so fast that the approval often lands in your inbox before you've had time to second-guess the application.
What We Gained
The shift to algorithmic lending has been genuinely liberating for millions of people. A single mother in Phoenix can apply for a mortgage without worrying that a loan officer will decide she's not "stable" enough. An immigrant family doesn't have to navigate the prejudices of a local banker. A young couple in rural Montana can get pre-approved without driving three hours to the nearest bank.
The process is faster, cheaper, and more accessible. It's also far more consistent. A person with a 720 credit score gets the same terms in Mississippi that they'd get in Massachusetts.
And yes, it's more fair. The algorithm doesn't care about your race, your gender, your religion, or your zip code—at least, not intentionally. Those biases that shaped lending decisions for generations? They're gone, or they're supposed to be.
What We Lost (and What We Didn't Notice)
But something disappeared along with the loan officer's judgment. The possibility of a human exception. The chance to explain your story.
If your credit score took a hit because of a medical emergency three years ago, the algorithm sees the damage. It doesn't know why, and it doesn't care. If you've been self-employed for five years and your income fluctuates, the algorithm sees inconsistency. It doesn't see resilience or entrepreneurship.
The algorithm is also only as fair as the data it's trained on. If historical lending data reflects decades of discrimination, the algorithm can learn to replicate it—just more efficiently. Studies have shown that automated lending systems sometimes perpetuate the same racial disparities that human loan officers created, just in ways that are harder to see and harder to challenge.
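That proxy effect can be illustrated with a deliberately simplified sketch. Everything here is hypothetical—toy data, made-up zip codes, an arbitrary approval threshold, not any real underwriting model—but it shows the mechanism: a model that never sees race, only zip codes, will faithfully reproduce whatever disparity the historical decisions encode.

```python
# Toy illustration of proxy bias: the "model" sees only zip codes,
# yet relearns the disparity baked into historical decisions.
from collections import defaultdict

# Hypothetical historical records: (zip_code, approved) pairs from an
# era when zip 10001 was redlined. Race never appears in the data.
history = [
    ("10001", False), ("10001", False), ("10001", True), ("10001", False),
    ("20002", True), ("20002", True), ("20002", False), ("20002", True),
]

# "Training": tally each zip code's historical approval rate.
totals, approvals = defaultdict(int), defaultdict(int)
for zip_code, approved in history:
    totals[zip_code] += 1
    approvals[zip_code] += approved

def predict(zip_code, threshold=0.5):
    """Approve when the zip's historical approval rate clears the bar."""
    rate = approvals[zip_code] / totals[zip_code]
    return rate >= threshold

# Two otherwise-identical applicants, different zip codes:
print(predict("10001"))  # False: the old redline, relearned from data
print(predict("20002"))  # True
```

No individual prejudice appears anywhere in this code, which is exactly the point: the discrimination lives in the training data, and the model's arithmetic launders it into something that looks neutral.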
And there's something else: the loss of relationship. Fifty years ago, your loan officer was part of your community. He knew your employer. He understood local economic conditions. He took a risk on you as a person, not a number. If you hit hard times, you could talk to him. He might work with you.
Today, your lender might be a corporation in another state. If you fall behind, you're dealing with a servicing company that's bound by rules. There's no discretion. There's no relationship to appeal to.
The Algorithm's Blind Spots
Here's where it gets interesting: the algorithm isn't actually more objective than the loan officer. It's just biased in different ways.
A human loan officer might discriminate based on race. An algorithm might discriminate based on zip code, which often amounts to the same thing. A loan officer might give a break to someone with potential. An algorithm has no concept of potential.
We've traded visible, personal bias for invisible, systemic bias. We've gained speed and consistency at the cost of nuance and mercy.
The Now and Then
Then: You sat in an office. You made your case. A man decided if you were trustworthy. The system was personal, often discriminatory, and offered no recourse.
Now: You fill out a form online. An algorithm decides in minutes. The system is fast, consistent, and theoretically fair—but it's also impersonal and leaves no room for exceptions.
Neither system is perfect. The loan officer's gut feeling could be wise or prejudiced. The algorithm's logic can be fair or can replicate historical injustice in a new form.
But what's clear is this: we've made mortgages more accessible to people who were once locked out. We've also made the process more distant, more rule-bound, and arguably less forgiving. The human element that made lending a relationship has been replaced by the efficiency of a system that sees you as a data point.
Maybe that was necessary. Maybe it was worth it. But it's worth noticing what we left behind when the loan officer finally left the room.