The Department of Housing and Urban Development last week took a swing at Facebook over the potential for discriminatory housing ads on its platform — and reverse mortgage originators and lenders would be wise to pay attention.
“I think it’s a shot across the bow by HUD,” Jim Milano, a partner at the Washington, D.C., law firm Weiner Brodsky Kider PC and co-general counsel to the National Reverse Mortgage Lenders Association, told RMD.
The federal government took specific issue with Facebook’s filtering tools, which allow advertisers to precisely target specific types of potential customers when planning campaigns. But that can be a problem for any housing-related company — including lenders, real estate agents, and landlords — because of federal protections under the Fair Housing Act and the Equal Credit Opportunity Act.
In its complaint, HUD accused Facebook of enabling housing advertisers to block protected groups of people from seeing their content. For instance, a lender could elect not to show its ads to users identified as interested in topics such as religions or ethnic groups, as well as those who lived in certain ZIP codes.
“The Fair Housing Act prohibits housing discrimination, including those who might limit or deny housing options with a click of a mouse,” Anna María Farías, HUD’s assistant secretary for fair housing and equal opportunity, said in a statement last week. “When Facebook uses the vast amount of personal data it collects to help advertisers to discriminate, it’s the same as slamming the door in someone’s face.”
Discriminatory housing advertisements are nothing new, but Milano said that HUD’s decision to go after the platform, rather than the advertiser, is. For instance, if a newspaper ran an advertisement for a lender that blatantly discriminated against a specific ethnic or racial group, Milano wouldn’t expect the government to go after the paper’s publisher or employees; instead, officials would take swift action against the offending lender.
“That’s what makes this case against Facebook interesting. … Users of Facebook need to wake up and look at this,” he said.
The Home Equity Conversion Mortgage (HECM) represents a curious case in the fair-housing space: by law, it is available only to a specific age group, people aged 62 and older. For that reason, Milano said, HECM lenders don’t have to worry about running afoul of anti-discrimination laws when they use online tools that target only homeowners above that age threshold; the ECOA offers a special carve-out for these loans.
“Certainly, in the reverse mortgage industry, people want to make loans. They don’t care what your ethnicity is, what color your skin is, but they do care about how old you are — that you’re HECM qualified,” he said.
But in Milano’s view, the biggest danger on Facebook and other social media platforms lies in the unintended consequences that certain filtering options could have for lenders. A well-meaning originator who restricts advertising to ZIP codes where HECMs are particularly popular, for instance, could unintentionally engage in redlining, a practice that unscrupulous banks have used for decades to control the racial makeup of neighborhoods by drawing clear lines around where they would and would not lend.
“If those geographic areas are not broad enough, and are not based on legitimate criteria, you could run into trouble,” Milano said.
To follow the law and prevent even unintentional discrimination, lenders should always ensure that any geographically targeted advertising has a solid legal reason behind it. An originator wouldn’t get in trouble for deliberately declining to seek out users in states where she wasn’t licensed to offer loans, for example, but could if the government found that she was advertising only in certain metropolitan statistical areas (MSAs) or ZIP codes without a non-discriminatory reason.
The ongoing problem for originators, however, is that it can be difficult to think through all the potential ways that clicking a specific box on an advertising platform could bring about negative outcomes and regulatory scrutiny.
“It’s hard to do, particularly in the new technology, because there could be a thousand filters, and they could be based on esoteric things that people would not normally think about,” Milano said.
Written by Alex Spanko