As India votes, false posts and hate speech flummox Facebook

NEW DELHI (NYTIMES) - After a suicide bombing on Feb 14 in the disputed border region of Kashmir, India accused neighbouring Pakistan of harbouring the terrorists who it said had orchestrated the attack. The two countries quickly traded airstrikes.

Online, there was another battle.

One clip that circulated widely on Facebook and other services purported to show an aerial assault by India on an alleged terrorist camp in Pakistan. It was, in fact, taken from a video game.

Photographs of dead bodies wrapped in white, supposedly of Pakistani militants killed in the attack, actually depicted victims of a 2015 heat wave, according to fact checkers. And local news outlets raced to post shreds of “exclusive” information about the hostilities, much of it downright false.

Facebook executives said the deluge was extraordinary.

“I’ve never seen anything like this before – the scale of fake content circulating on one story,” tweeted Mr Trushar Barot, a former BBC journalist who leads the social network’s anti-disinformation efforts in India.

The flood of fake posts gave Facebook a taste of what is to come as India prepares for the world’s biggest election. Prime Minister Narendra Modi and his Bharatiya Janata Party (BJP) are seeking another five years in power, and as many as 879 million people are expected to vote over five weeks starting on April 11.

But as campaigning goes into high gear, Facebook is already struggling to cope with the disinformation and hate speech on its core social network and on WhatsApp, its popular messaging service.

On Monday (April 1), the company said it had removed hundreds of misleading pages and accounts associated with the BJP and its main rival, the Indian National Congress, many of which were publishing false information. Facebook also removed more than 100 fake pages and accounts controlled by the Pakistani military.

India – where the company has 340 million users, more than in any other country – poses distinct challenges.

Posts and videos in more than a dozen languages regularly flummox Facebook’s automated screening software and its human moderators, both of which are built largely around English.

Many problematic posts come directly from candidates, political parties and the media. And on WhatsApp, where messages are encrypted, the company has little visibility into what is being shared.

“India’s elections present a unique set of issues, including a large number of languages and an extended time period for voting,” said Ms Katie Harbath, Facebook’s public policy head for global elections. She said the company had been planning for the election for more than a year.

Other major social media platforms like Twitter and Google’s YouTube are also grappling with false news and hate speech around India’s elections. For the first time, the country’s Election Commission has asked online services to police election-related content from candidates and parties.

But the stakes are especially high for Facebook, which has been under scrutiny since the 2016 US presidential election for distributing misinformation and for being misused by Russian agents to stir discord.

While the company survived the 2018 US midterm elections relatively unscathed, it still found foreign influence networks attempting to use it to sway voters. And Facebook’s services were plagued by misinformation in last year’s Mexican and Brazilian elections.

India is one of Facebook’s most important electoral tests this year, company officials said, along with major elections in places including the Philippines and Indonesia. Facebook’s performance will be a prelude to how it navigates a likely onslaught of propaganda, false information and foreign meddling during the 2020 presidential election in the US.

“The Indian elections in 2019 are an important test case for how they get 2020 right, and how they get elections right everywhere going forward,” said Mr Graham Brookie, director of the Digital Forensic Research Lab at the Atlantic Council, a think tank working with Facebook to study disinformation campaigns.

India has been a laboratory for Facebook’s election efforts since 2014, when the country last chose a parliament. In that contest, Facebook worked closely with Mr Modi’s campaign to target ads and rally fans – part of a global effort to spread the use of the platform in politics. With Facebook’s assistance, Mr Modi became the world’s second-most “liked” politician, behind former US President Barack Obama, according to campaign officials.

After Mr Modi’s victory, Facebook advised him on how to use its service to govern, including getting government agencies and officials online. In 2015, Mr Mark Zuckerberg, Facebook’s chief executive, hosted Mr Modi for a televised chat at the company’s Silicon Valley headquarters. Facebook held India up as a model for how governments could use the social network.

Facebook has since downplayed its connection to politicians around the world, including Mr Modi, amid a rise in political misinformation.

Mr Ajit Mohan, a former Fox executive who became the social network’s first India chief in January, said, “We are absolutely not affiliated to any political party in India or anywhere else in the world.”

Now all the major Indian parties have sophisticated disinformation strategies, which include posting false and manipulated photos and videos and coordinating posts across a network of paid acolytes and volunteers. That has put Facebook, which has said it does not want to stifle free expression, in an awkward position.

For the past year, the company has relied on two independent organisations – first a local group called Boom and, more recently, the news agency Agence France-Presse – to fact-check a handful of posts in India every day. In February, Facebook added five more organisations to the stable and expanded the number of languages covered to seven, up from just English initially.

Facebook’s algorithms flag potentially fake posts to the fact-checkers, who decide which ones to investigate. After they publish their findings on Facebook, the company said it reduces the visibility of false posts.

Yet some of Facebook’s fact-checkers may themselves be contributing to misinformation. Alt News, an Indian fact-checking site unaffiliated with Facebook, recently found two of the social network’s new partners – the large media houses India Today Group and Jagran Media Network – had repeatedly published false information related to the Kashmir attack.

Mr Pratik Sinha, the founder of Alt News, said Facebook did not seem to view false news as a serious problem. “The whole thing is a PR effort,” he said. Mr Jency Jacob, managing editor of Boom, said Facebook needed to do more, including making it easier for users to report misinformation.

Mr Balkrishna, who heads the fact-checking effort at India Today, said mistakes happen and that the company tries to correct them promptly. Mr Pratyush Ranjan, a senior editor at Jagran New Media, the company’s digital division, said its fact-checking site, Vishvas News, operates separately and follows strict ethics guidelines. Facebook said it relied on the Poynter Institute, an American journalism organisation, to certify its fact-checkers and review their ethical standards.

A separate Facebook effort to assemble a broad coalition of Indian news organisations to combat false news during the election has been delayed amid funding problems and the departure of a key journalist who had worked on it.

The company has also tried to shed more light on political advertising on its site, posting weekly reports on top advertisers and how much they spent. Since Feb 21, when Facebook began tracking the data for India, spending has totalled about 103 million rupees (S$2.02 million).

But the social network has not pushed for transparency on who is really paying for the ads. While the site’s top three political advertisers in India were all associated with the BJP’s election efforts, none of them explicitly disclosed that affiliation to viewers.

Facebook has also had difficulty dealing with hate speech, which is expected to intensify during the election.

For example, Mr Raja Singh, a fiery right-wing Hindu legislator in the southern Indian city of Hyderabad, refers to Muslims as “cow killers” – an inflammatory phrase that has led to some deaths since the country’s Hindu majority considers cows to be sacred. Although Facebook had removed some of Mr Singh’s posts for violating its hate speech rules, at least one threatening rant – in which he calls Muslims “dogs” and talks of cutting off their necks – remained on his official Facebook page for months.

Facebook removed the video, and then deleted Mr Singh’s Facebook page, after The New York Times inquired about it.

In an interview, Mr Singh said he was simply defending Hindus from Muslims. “It is imperative that we threaten them in our own style,” he said.

Ms Thenmozhi Soundararajan, founder of Equality Labs, a human rights group in the US, said her organisation recently studied more than 1,000 Facebook posts that attacked caste and religious minorities.

It found that 80 per cent of the posts stayed on the social network after they were reported as hate speech, and nearly half the posts that were initially removed were up again several months later.

Mr Mohan said Facebook recognises it has more work to do. It recently decided to set up an election “war room” in India to better respond to problems.

“This will evolve and continue to get better,” he said. “There is a tremendous amount of sincerity that we need to get this right.”
