Facebook agrees to hide demographics from landlords, creditors looking to advertise

Landlords will no longer be able to target ads to certain demographics

The federal government, through the U.S. Department of Housing and Urban Development, had filed its own complaint against Facebook for violating the Fair Housing Act, which prohibits discrimination in the sale or rental of housing based on such factors as race or religion.

The company also was accused of allowing employers to discriminate against women by targeting job listings toward men only, specifically in traditionally male-dominated fields.

The complaints followed a 2016 report by ProPublica that found Facebook allowed advertisers to exclude anyone with an “affinity” for African-American, Asian-American or Hispanic people.

The Fair Housing Act of 1968 says it's illegal "to make, print, or publish, or cause to be made, printed, or published any notice, statement, or advertisement, with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin."

Facebook said in a blog post Tuesday by Sheryl Sandberg, the company's chief operating officer, that it already prohibited certain categories from being used in advertising such as race, ethnicity, sexual orientation and religion. The company said it removed the option for landlords, employers and creditors to reach people with an "affinity for African-American or Hispanic-related content" about a year ago.

Under the new rules, a company looking to run housing, employment or credit ads also won't be able to target Facebook members based on age, gender or ZIP code. Facebook said it was also building a tool that would let people search all housing ads in the U.S., regardless of whether the ads were shown to them.

“Housing, employment and credit ads are crucial to helping people buy new homes, start great careers, and gain access to credit,” Sandberg wrote in the post. “They should never be used to exclude or harm people.”

In a joint statement, the advocacy groups, including the National Fair Housing Alliance, said the agreement represented a "significant and historic precedent for big data and tech companies" because it makes clear that the information they collect can't be used to discriminate.

"Companies must understand that depending on how data is being used, it can harm people and communities," said Fred Freiberg, director of the Fair Housing Justice Center of New York. "This agreement will help other companies that rely on algorithms and data for a range of services and operations to carefully consider whether their policies, products, and platforms are illegally discriminating against consumers."