As one of the largest tech companies in the world, Facebook is a lightning rod for criticism. Facebook holds a massive amount of data on its users, and it allows marketers and advertisers to use some of that data when targeting consumers. While this may not be very different from what all online ad systems do, the amount of data and the size of the network make Facebook’s actions subject to greater scrutiny.
One of the company’s more recent legal challenges can serve as a warning for business owners, marketers and advertisers about the dangers of using big data for targeting purposes. In a recent lawsuit, a plaintiff alleges that Facebook and its advertising partners “used Facebook’s platform to discriminate based on race, gender, national origin and other protected attributes”.
The issue stems from a ProPublica investigation published in October, which claimed that some advertisers were creating housing ads that excluded African Americans, Asian Americans and Hispanics.
While this is more about the advertisers than Facebook, if the actions in the ProPublica report are true, they may constitute a violation of the Fair Housing Act. Facebook allows a great deal of fine-tuning in its targeting options, so it’s possible that some advertisers have used them in inappropriate ways. The suit claims that, by allowing advertisers to block users from seeing ads based on these groups, Facebook and the over 9,000 other defendants are in violation of the Fair Housing Act and the Civil Rights Act.
Facebook, of course, vigorously denies these sorts of allegations. In a statement to the BBC, a Facebook spokesperson wrote, “We are committed to providing people with quality ad experiences, which includes helping people see messages that are both relevant to the cultural communities they are interested in and have content that reflects or represents their communities – not just generic content that’s targeted to mass audiences.
“We believe that multicultural advertising should be a tool for empowerment. We take a strong stand against advertisers misusing our platform: our policies prohibit using our targeting options to discriminate, and they require compliance with the law. We take prompt enforcement action when we determine that ads violate our policies.”
The fact that the case has a single plaintiff and is not a class action suggests it may not go very far. But the point here isn’t about the merits of the lawsuit. Rather, the lesson to be learned is about “optical issues”. Just because something is legal and ethical doesn’t mean it appears that way to everyone who sees it. It’s like when people complain that uniformed cops eating in a restaurant are a waste of taxpayer money, when in fact cops simply get lunch breaks too.
Business owners, like Facebook, may find themselves in a situation where something they do looks questionable to outsiders. For example, businesses that offer payday loans or reverse mortgages have reasons to target certain demographics, but this can backfire on them because of such optical issues. This doesn’t mean that this kind of targeting is always bad. If a scholarship program for minorities wanted to advertise on Facebook, for instance, it would be helpful if it weren’t wasting half its advertising budget showing ads to the wrong demographic.
So it makes sense that Facebook allows this kind of targeting in certain situations. But as its statement above shows, the company was aware of the possible optical issues. It was prepared with answers to these questions, even while admitting that people can potentially misuse its services, and pledged to do more to tackle the problem.
In the end, the current suit against Facebook is likely to go nowhere. But there is still an important lesson that all business owners can take from Facebook’s handling of a situation they knew could be volatile.
For more recent news about Facebook, read this article on a possible change to the way Facebook shows ads to Facebook Groups.