Facebook whistleblower makes case for social media regulation
LONDON (NYTIMES) – Frances Haugen, the former Facebook product manager-turned-whistleblower, appeared before British lawmakers on Monday (Oct 25), painting a portrait of a company vividly aware of its harmful effects on society but unwilling to act because doing so could jeopardise profits and growth.
Haugen’s more than two hours of testimony before the British Parliament was the latest step in her tightly choreographed campaign to build a case for stiffer oversight of the social media giant.
Hours before she began speaking in London, more than a dozen news organisations published articles based on the Facebook Papers, a cache of documents she took before resigning from the company.
In the coming weeks, Haugen is scheduled to meet with officials in France, Germany and the European Union about new laws that she says are necessary to force Facebook to recalibrate how it measures success more towards the public good.
“We need regulation,” Haugen said on Monday. “Until the incentives change, Facebook will not change,” she added later.
Even for Facebook, a company that has lurched from controversy to controversy since Mark Zuckerberg started it as a Harvard University undergrad in 2004, Haugen’s disclosures have created a backlash that stands apart.
The revelations have generated increased political support for new regulation in the United States and Europe, including some calls for Zuckerberg to step aside as Facebook’s CEO, putting Facebook on the defensive.
The growing rancour could lead to new government investigations and force the company to disclose more details about how its software works.
“Facebook is failing to prevent harm to children, it’s failing to stop the spread of disinformation, it is failing to stop the spread of hate speech,” John Nicolson, a lawmaker from Scotland, said during the hearing. “It does have the power to deal with these issues, it’s just choosing not to.”
Haugen left Facebook with scores of internal research reports, slide decks, discussion threads, presentations and memos that she has shared with lawmakers, regulators and journalists. The information provides an unvarnished view of how some within the company tried to raise alarms about its harmful effects, but often struggled to get Facebook leaders to act.
Facebook defended its practices and said it had spent US$13 billion and hired 40,000 people to work on safety issues.
“Contrary to what was discussed at the hearing, we’ve always had the commercial incentive to remove harmful content from our sites,” said Mitch Henderson, a company spokesperson. “People don’t want to see it when they use our apps and advertisers don’t want their ads next to it.”
After leaking internal company documents to The Wall Street Journal, which resulted in a series of articles that began in September, she revealed her identity this month in an episode of “60 Minutes” and testified before a Senate committee. She also shared the documents with the Securities and Exchange Commission.
Since then, she has shared the Facebook materials with other news organisations, including The New York Times, resulting in additional stories about Facebook’s harmful effects, including its role in spreading election misinformation in the United States and stoking divisions in countries such as India.
Haugen’s visit to Europe reflects the region’s aggressive approach to tech regulation and a belief that its policymakers will act faster than the United States to pass new laws aimed at Facebook and other tech giants.
“For all the problems Frances Haugen is trying to solve, Europe is the place to be,” said Mathias Vermeulen, the public policy director at AWO, a law and policy firm that is among the groups working with Haugen in the United States and Europe.
In London, Haugen told policymakers that regulation could offset a corporate culture at Facebook that rewards ideas that get people to spend more time scrolling through their social media feeds but treats safety as a less important “cost centre”.
Facebook’s influence is particularly bad in areas of Africa, Asia and the Middle East where its services are popular but the company does not have language or cultural expertise, Haugen said.
Without government intervention, she told lawmakers, events in countries such as Ethiopia and Myanmar, where Facebook has been accused of contributing to ethnic violence, are the “opening chapters of a novel that is going to be horrific to read”.
She suggested policies that would require Facebook to perform annual risk assessments to identify areas where its products were causing harm – such as the spread of coronavirus misinformation, or harm to teenagers’ mental health. She said Facebook could be required to outline specific solutions and share the findings with outside researchers and auditors to be sure they are sufficient.
Without government-mandated transparency, Facebook can present a false picture of its efforts to address hate speech and other extreme content, she said. The company says artificial intelligence software catches more than 90 per cent of hate speech, but Haugen said the number was less than 5 per cent.
“They are very good at dancing with data,” she said.
British policymakers are drafting a law to create a new internet regulator that could impose billions of dollars’ worth of fines if more is not done to stop the spread of hate speech, misinformation, racist abuse and harmful content targeting children.
The policy ideas gained additional momentum after the murder this month of David Amess, a member of Parliament, which led to calls for the law to force social media companies to crack down on extremism.
In Brussels, Haugen is scheduled to meet on Nov 8 with European Union officials drafting laws that would force Facebook and other large internet platforms to disclose more about how their recommendation algorithms choose to promote certain material over others, and impose tougher antitrust rules to prevent the companies from using their dominant positions to box out smaller rivals.
European policymakers are also debating a ban on targeted advertising based on a person’s data profile, which would pose a grave threat to Facebook’s multibillion-dollar advertising business.
Despite growing political support for new regulation, many questions remain about how such policies would work in practice. Any new laws in Britain and the European Union are not expected to be passed until next year at the earliest.
In the United States, lawmakers are focusing on the harmful effects that Facebook and other social media platforms have on children.
Regulating Facebook is particularly complex because many of its biggest problems centre on content posted by users all over the world, raising difficult questions about the regulation of speech and free expression.
In Britain, the new online safety law has been criticised by some civil society groups as being overly restrictive and a threat to free speech online.
Another challenge is how to enforce the new rules, particularly at a time when many government agencies are under pressure to tighten spending.