
Facebook should take action now as Myanmar's 2020 election is a likely flashpoint: Report

BANGKOK – Myanmar’s general election in 2020 will likely be a flashpoint for hate speech and harassment, and Facebook should prepare now for such scenarios, warns a new report on the social media giant’s impact on human rights.

The independent study, conducted by the non-profit organisation Business for Social Responsibility (BSR), was commissioned by Facebook amid widespread concern that the platform was being used to harass and incite hatred, most notably against Myanmar’s Muslim Rohingya minority.

Facebook is the primary source of online information in Myanmar, where digital literacy remains low but new mobile phones come pre-installed with the Facebook application.

“The 2020 parliamentary elections are likely to be a flashpoint for hate speech, harassment, misinformation, incitement to violence, and other actions designed to undermine the political process,” said the BSR report released by Facebook on Tuesday Singapore time (Nov 6).

“Today’s challenging circumstances are likely to escalate in the run-up to the election, and Facebook would be well-served by preparing for multiple eventualities now.”

There are also signs that the state surveillance apparatus – which was pervasive during the 50 years of military rule before Myanmar’s political transition in this decade – is re-emerging with the help of social media, the report warned.

“The government, military, and Buddhist nationalist groups are all demonstrating an increasingly sophisticated targeting of civil society leaders, activists, and human rights groups on the Facebook platform in ways that draw upon improved surveillance capabilities,” it said.

Myanmar’s risky online environment is developing amid a tenuous political transition. De facto leader Aung San Suu Kyi, who oversees a civilian government wedded to a military that retains broad powers, is under pressure for her perceived tardiness in reforming the country’s political, legal and economic systems. In a by-election last Saturday, her ruling National League for Democracy party lost three of the seats it held to the military-aligned Union Solidarity and Development Party.

In August, Facebook banned 20 individuals and organisations – including military chief Min Aung Hlaing – from its social network to prevent them from using the platform to incite racial or religious tension. This took place as a United Nations fact-finding report called for Myanmar’s military leaders to be investigated for genocide.

In response to the report, Facebook product policy manager Alex Warofka wrote in a blog post: “We know we need to do more to ensure we are a force for good in Myanmar, and in other countries facing their own crises.”

Facebook, which has established a team to work on Myanmar-specific issues, has employed 99 native Burmese-language speakers to review content and plans to hire more human rights specialists, he wrote.

The California-based company, which does not share specific reviewer locations for safety reasons, says it does not have an office in Myanmar due to concerns over government leverage on content and potential risks to its employees.

But Mr Warofka wrote that Facebook has “improved proactive detection of hate speech in Myanmar” and is “taking more aggressive action on networks of accounts that are set up to mislead others about who they are, or what they’re doing”.

From August to September, it took action on about 64,000 items of content in Myanmar for violating its hate speech policies. Of this figure, 63 per cent were identified by its staff.

With the help of artificial intelligence, Facebook will also reduce the distribution of posts and comments that contain graphic violence or are dehumanising while they undergo review.

“We also plan to reduce the distribution of individual posts from people and pages in Myanmar who post content that is spammy or sensational,” he wrote.
