Opinion | Facebook, ‘Asleep at the Wheel’

This article is part of David Leonhardt’s newsletter. You can sign up here to receive it each weekday.
After Facebook helped spread misinformation during the 2016 presidential campaign, company executives vowed to do better. They put in place policies meant to reduce the amount of political misinformation on the platform.
They’re not doing a very good job of keeping their word.
Judd Legum, author of the Popular Information newsletter, wrote yesterday that he had “identified hundreds of ads from the Trump [2020 re-election] campaign that violated Facebook’s ad policies. Facebook removed the ads only after I brought them to its attention.”
Many of the ads mention the personal attributes of the person seeing them, such as gender or race. Facebook banned this approach after the 2016 Trump campaign used demographic information in an effort to suppress African-American turnout in swing states.
Yet Legum discovered Team Trump again using dirty tricks. This time, it has falsely put the same quotation in the mouths of two different African-American men portrayed in different ads, one old and one young: “Sir, you have really inspired me and brought back my faith in this great nation. From the bottom of my heart, thank you for all the work you are doing.”
Spreading hate
Other journalists have also discovered that Facebook is failing to abide by its own policies. “The most obvious flaw in Facebook’s new system is that it misses ads it should catch,” ProPublica reported. Among the examples it found: An ad by the Washington State Democratic Party that wasn’t screened, because Facebook’s algorithm didn’t recognize it as a political ad.
And Media Matters, a liberal watchdog group, wrote the following in response to Legum’s new reporting:
“This isn’t the first time Facebook has failed to detect policy violations by advertisers on the platform. In September, Media Matters found a series of ads from right-wing clickbait sites, conspiracy theorists, and extremists which violated Facebook’s policies on false content and discriminatory practices. These ads included: posts from white supremacist Paul Nehlen promoting another white supremacist; anti-Muslim false news; anti-LGBT content; and 9/11 truther, QAnon, and Pizzagate conspiracy theories.”
The central problem seems to be that Facebook is primarily using an artificial-intelligence algorithm to police ads — and the algorithm isn’t very effective.
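Facebook’s actual system is proprietary, but the failure mode ProPublica describes — a screen that catches only what it recognizes — can be illustrated with a toy sketch. This hypothetical keyword-based classifier (not Facebook’s real method) flags an ad as political only when it contains obvious campaign language, so a party ad phrased as an issue message slips through:

```python
# Toy illustration (hypothetical; not Facebook's actual classifier):
# a naive keyword screen misses political ads that avoid obvious terms.

POLITICAL_KEYWORDS = {"vote", "election", "campaign", "candidate", "ballot"}

def is_political(ad_text: str) -> bool:
    """Flag an ad as political if it contains any known keyword."""
    words = {w.strip(".,!?").lower() for w in ad_text.split()}
    return bool(words & POLITICAL_KEYWORDS)

# An ad with explicit campaign language is caught...
print(is_political("Vote for our candidate on Tuesday!"))       # True

# ...but a party's ad phrased as an issue message slips through.
print(is_political("Join us in supporting stronger schools."))  # False
```

Real ad-review systems are far more sophisticated than this, but the structural weakness is the same: an automated screen has no trouble with ads that announce themselves, and no answer for ads that don’t.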
Perhaps the worst part of the situation is that Facebook is doing little to punish groups that violate its terms. If you break the rules, the company probably won’t catch you, Legum’s reporting suggests. And even if it does, it will merely take down your ad, sometimes after most of your intended audience has already seen it.
As Legum wrote: “What is the point of the Facebook policies if they are not enforced in advance of publication? As it stands, in the unlikely circumstance that you are caught, the only consequence of breaking the rules is being told to fix the issue. There is no incentive to follow the rules in the first place.”
“Facebook,” he added, “is asleep at the wheel.”
Facebook is a powerful force in society today, and with that power comes responsibility. The company needs to do better.
If you are not a subscriber to this newsletter, you can subscribe here. You can also join me on Twitter (@DLeonhardt) and Facebook.
Follow The New York Times Opinion section on Facebook, Twitter (@NYTopinion) and Instagram.
David Leonhardt is a former Washington bureau chief for the Times, and was the founding editor of The Upshot and head of The 2020 Project, on the future of the Times newsroom. He won the 2011 Pulitzer Prize for commentary, for columns on the financial crisis. @DLeonhardt • Facebook