Facebook sparked fresh criticism over how much influence it may have on elections with a new policy that enables users to better control what ads they see but doesn’t limit how political ads are targeted.
New features will enable people to see fewer political and social-issue ads on both the company's Facebook and Instagram services, Rob Leathern, director of product management, said in a statement released Thursday. The control on political ads will be introduced in the U.S. in early summer, then expand to more locations.
Facebook, Twitter and Alphabet are among the Big Tech companies increasingly scolded by politicians and activists for enabling the spread of misleading information and false news. While Twitter and Alphabet’s Google have agreed to restrict or block political advertising, Facebook has declined to do so, saying it would prefer to see regulation that covers the issue.
“We don’t think decisions about political ads should be made by private companies, which is why we are arguing for regulation that would apply across the industry,” Leathern said. “We believe the sooner Facebook and other companies are subject to democratically accountable rules on this the better.”
Facebook's new policy provoked swift backlash, even from media outlets. A New York Times headline said Facebook won't "back down from allowing lies," while the Associated Press said the company "again refuses to ban political ads, even false ones."
U.S. Federal Election Commissioner Ellen Weintraub offered sharp criticism, tweeting that Facebook’s “weak plan suggests the company has no idea how seriously it is hurting democracy.”
Facebook has attempted to show it wants to provide users with reliable information. In October it introduced a news section that includes stories from mainstream publications deemed to be reliable. It also instituted new policies — including the clear labeling of fake news and state-controlled media — to help users interpret what they read.
- Facebook board member and billionaire libertarian Peter Thiel has been seen as influencing CEO Mark Zuckerberg as the company hews to a policy of accepting political ads without fact-checking them.
- An experiment by the NATO Strategic Communication Centre of Excellence found that 80% of fake clicks, likes and followers for 105 different posts on Facebook, Instagram, Twitter and YouTube were still online after four weeks, and that the platforms largely failed to respond even when notified of the false engagement.